Dec 08 20:04:42 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 08 20:04:42 crc restorecon[4699]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 20:04:42 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 08 20:04:43 crc restorecon[4699]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 
20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 
crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 
20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 
crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc 
restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 08 20:04:43 crc restorecon[4699]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 08 20:04:43 crc kubenswrapper[4781]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 08 20:04:43 crc kubenswrapper[4781]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 08 20:04:43 crc kubenswrapper[4781]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 08 20:04:43 crc kubenswrapper[4781]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 08 20:04:43 crc kubenswrapper[4781]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 08 20:04:43 crc kubenswrapper[4781]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.970540 4781 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973746 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973763 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973769 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973774 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973778 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973783 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973788 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973793 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973800 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973805 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973810 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973822 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973827 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973832 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973837 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973841 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973846 4781 feature_gate.go:330] unrecognized feature gate: Example Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973850 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973855 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973859 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973864 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973868 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973873 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS 
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973877 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973882 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973888 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973894 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973898 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973903 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973908 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973930 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973936 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973942 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973947 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973952 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973958 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973970 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973977 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973982 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973987 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973992 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.973996 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974001 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974005 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974010 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974016 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974020 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974026 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974030 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974035 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974039 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974044 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974048 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974053 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974057 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974064 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974070 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974075 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974081 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974086 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974091 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974096 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974102 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974107 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974111 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974117 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974122 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974127 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974132 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974137 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.974142 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974236 4781 flags.go:64] FLAG: --address="0.0.0.0"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974248 4781 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974257 4781 flags.go:64] FLAG: --anonymous-auth="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974264 4781 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974270 4781 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974276 4781 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974284 4781 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974291 4781 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974296 4781 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974301 4781 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974307 4781 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974313 4781 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974321 4781 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974327 4781 flags.go:64] FLAG: --cgroup-root=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974332 4781 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974337 4781 flags.go:64] FLAG: --client-ca-file=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974342 4781 flags.go:64] FLAG: --cloud-config=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974348 4781 flags.go:64] FLAG: --cloud-provider=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974353 4781 flags.go:64] FLAG: --cluster-dns="[]"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974360 4781 flags.go:64] FLAG: --cluster-domain=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974365 4781 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974371 4781 flags.go:64] FLAG: --config-dir=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974377 4781 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974384 4781 flags.go:64] FLAG: --container-log-max-files="5"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974391 4781 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974396 4781 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974401 4781 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974407 4781 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974413 4781 flags.go:64] FLAG: --contention-profiling="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974418 4781 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974424 4781 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974430 4781 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974435 4781 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974442 4781 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974447 4781 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974453 4781 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974458 4781 flags.go:64] FLAG: --enable-load-reader="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974464 4781 flags.go:64] FLAG: --enable-server="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974469 4781 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974476 4781 flags.go:64] FLAG: --event-burst="100"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974481 4781 flags.go:64] FLAG: --event-qps="50"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974486 4781 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974492 4781 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974497 4781 flags.go:64] FLAG: --eviction-hard=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974505 4781 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974511 4781 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974516 4781 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974522 4781 flags.go:64] FLAG: --eviction-soft=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974530 4781 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974534 4781 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974540 4781 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974544 4781 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974549 4781 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974554 4781 flags.go:64] FLAG: --fail-swap-on="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974559 4781 flags.go:64] FLAG: --feature-gates=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974566 4781 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974571 4781 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974576 4781 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974582 4781 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974588 4781 flags.go:64] FLAG: --healthz-port="10248"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974593 4781 flags.go:64] FLAG: --help="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974598 4781 flags.go:64] FLAG: --hostname-override=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974605 4781 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974611 4781 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974616 4781 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974622 4781 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974627 4781 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974632 4781 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974637 4781 flags.go:64] FLAG: --image-service-endpoint=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974643 4781 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974648 4781 flags.go:64] FLAG: --kube-api-burst="100"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974654 4781 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974659 4781 flags.go:64] FLAG: --kube-api-qps="50"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974664 4781 flags.go:64] FLAG: --kube-reserved=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974670 4781 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974675 4781 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974681 4781 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974686 4781 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974691 4781 flags.go:64] FLAG: --lock-file=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974696 4781 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974701 4781 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974707 4781 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974716 4781 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974721 4781 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974727 4781 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974732 4781 flags.go:64] FLAG: --logging-format="text"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974737 4781 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974744 4781 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974750 4781 flags.go:64] FLAG: --manifest-url=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974755 4781 flags.go:64] FLAG: --manifest-url-header=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974762 4781 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974768 4781 flags.go:64] FLAG: --max-open-files="1000000"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974775 4781 flags.go:64] FLAG: --max-pods="110"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974780 4781 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974786 4781 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974791 4781 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974797 4781 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974802 4781 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974808 4781 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974814 4781 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974829 4781 flags.go:64] FLAG: --node-status-max-images="50"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974835 4781 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974840 4781 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974846 4781 flags.go:64] FLAG: --pod-cidr=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974851 4781 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974860 4781 flags.go:64] FLAG: --pod-manifest-path=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974866 4781 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974871 4781 flags.go:64] FLAG: --pods-per-core="0"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974877 4781 flags.go:64] FLAG: --port="10250"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974882 4781 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974887 4781 flags.go:64] FLAG: --provider-id=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974893 4781 flags.go:64] FLAG: --qos-reserved=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974898 4781 flags.go:64] FLAG: --read-only-port="10255"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974903 4781 flags.go:64] FLAG: --register-node="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974909 4781 flags.go:64] FLAG: --register-schedulable="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974933 4781 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974944 4781 flags.go:64] FLAG: --registry-burst="10"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974949 4781 flags.go:64] FLAG: --registry-qps="5"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974955 4781 flags.go:64] FLAG: --reserved-cpus=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974961 4781 flags.go:64] FLAG: --reserved-memory=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974969 4781 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974975 4781 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974981 4781 flags.go:64] FLAG: --rotate-certificates="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974986 4781 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974991 4781 flags.go:64] FLAG: --runonce="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.974996 4781 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975002 4781 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975008 4781 flags.go:64] FLAG: --seccomp-default="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975013 4781 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975018 4781 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975024 4781 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975029 4781 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975035 4781 flags.go:64] FLAG: --storage-driver-password="root"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975040 4781 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975045 4781 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975050 4781 flags.go:64] FLAG: --storage-driver-user="root"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975055 4781 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975061 4781 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975066 4781 flags.go:64] FLAG: --system-cgroups=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975071 4781 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975080 4781 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975085 4781 flags.go:64] FLAG: --tls-cert-file=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975091 4781 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975099 4781 flags.go:64] FLAG: --tls-min-version=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975104 4781 flags.go:64] FLAG: --tls-private-key-file=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975109 4781 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975115 4781 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975120 4781 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975126 4781 flags.go:64] FLAG: --v="2"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975134 4781 flags.go:64] FLAG: --version="false"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975141 4781 flags.go:64] FLAG: --vmodule=""
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975147 4781 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975152 4781 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975276 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975284 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975290 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975295 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975300 4781 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975305 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975311 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975317 4781 feature_gate.go:330] unrecognized feature gate: Example
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975321 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975326 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975332 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975338 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975343 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975349 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975353 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975358 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975363 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975368 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975372 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975377 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975381 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975385 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975390 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975394 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975399 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975404 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975409 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975413 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975418 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975423 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975427 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975432 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975436 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975441 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975446 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975450 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975454 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975459 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975463 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975469 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975474 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975479 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975484 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975490 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975495 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975499 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975504 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975508 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975514 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975519 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975524 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975529 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975535 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975539 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975544 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975548 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975554 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975558 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975564 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975568 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975573 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975578 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975583 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975587 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975592 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975597 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975601 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975605 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975610 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975615 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.975619 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.975783 4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.984453 4781 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.984497 4781 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.985844 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.985911 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.985965 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.985977 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.985987 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.985999 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986008 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986016 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986024 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986032 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986039 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986048 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986056 4781 feature_gate.go:330] unrecognized feature gate: Example Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986065 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986073 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986081 4781 feature_gate.go:330] 
unrecognized feature gate: BuildCSIVolumes Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986089 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986097 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986107 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986115 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986123 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986133 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986141 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986149 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986157 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986165 4781 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986174 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986181 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986189 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986201 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986211 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986219 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986236 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986245 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986255 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986264 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986272 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986281 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986289 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986298 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986306 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986314 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986322 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986329 4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig 
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986337 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986345 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986353 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986360 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986368 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986375 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986386 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986396 4781 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986406 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986414 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986423 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986431 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986440 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986449 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986458 
4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986466 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986473 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986481 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986488 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986496 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986505 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986516 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986525 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986537 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986547 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986556 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986564 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.986578 4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986829 4781 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986846 4781 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986856 4781 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986865 4781 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986875 4781 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986884 4781 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986893 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986901 4781 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986911 4781 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986962 4781 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986972 4781 feature_gate.go:330] unrecognized feature gate: Example Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986983 4781 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986991 4781 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.986999 4781 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987007 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987014 4781 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987024 4781 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987037 4781 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987048 4781 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987059 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987069 4781 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987079 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 08 20:04:43 crc 
kubenswrapper[4781]: W1208 20:04:43.987089 4781 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987098 4781 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987108 4781 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987117 4781 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987127 4781 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987137 4781 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987151 4781 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987165 4781 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987177 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987188 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987198 4781 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987207 4781 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987216 4781 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987224 4781 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987232 4781 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987241 4781 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987249 4781 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987258 4781 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987266 4781 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987275 4781 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987283 4781 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987290 4781 
feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987303 4781 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987316 4781 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987328 4781 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987338 4781 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987351 4781 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987364 4781 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987375 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987385 4781 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987396 4781 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987406 4781 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987416 4781 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987426 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987436 4781 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987446 
4781 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987456 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987465 4781 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987476 4781 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987485 4781 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987495 4781 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987509 4781 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987519 4781 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987529 4781 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987539 4781 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987549 4781 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987558 4781 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987568 4781 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 08 20:04:43 crc kubenswrapper[4781]: W1208 20:04:43.987578 4781 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.987594 
4781 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.987961 4781 server.go:940] "Client rotation is on, will bootstrap in background" Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.993470 4781 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.993622 4781 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.994524 4781 server.go:997] "Starting client certificate rotation" Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.994555 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.995045 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-09 17:48:37.500241187 +0000 UTC Dec 08 20:04:43 crc kubenswrapper[4781]: I1208 20:04:43.995163 4781 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h43m53.505083535s for next certificate rotation Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.002650 4781 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.006350 4781 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.016662 4781 log.go:25] "Validated CRI v1 runtime API" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.043315 4781 log.go:25] "Validated CRI v1 image API" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.045649 4781 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.048569 4781 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-08-20-00-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.048603 4781 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.064829 4781 manager.go:217] Machine: {Timestamp:2025-12-08 20:04:44.063253296 +0000 UTC m=+0.214536703 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b452ca71-9514-4136-b425-cea2dc682adc BootID:d25e13ca-71aa-4676-b902-e9f68902b8c8 Filesystems:[{Device:/dev/shm DeviceMajor:0 
DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2e:c0:69 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2e:c0:69 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:fa:78:52 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:84:5b:3f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6c:20:3e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:2b:cf:22 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:16:db:e7:9b:f8:9c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:9e:0f:84:f9:e8:29 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: 
DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.065129 4781 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.065301 4781 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.066080 4781 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.066310 4781 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.066355 4781 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.066622 4781 topology_manager.go:138] "Creating topology manager with none policy" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.066640 4781 container_manager_linux.go:303] "Creating device plugin manager" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.066889 4781 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.066979 4781 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.067199 4781 state_mem.go:36] "Initialized new in-memory state store" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.067318 4781 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.068276 4781 kubelet.go:418] "Attempting to sync node with API server" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.068305 4781 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.068349 4781 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.068369 4781 kubelet.go:324] "Adding apiserver pod source" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.068385 4781 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.071376 4781 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.071888 4781 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.073309 4781 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.073981 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074100 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.073987 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074173 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074367 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.074313 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074446 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074493 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074505 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074522 4781 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074533 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074542 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074555 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.074043 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.074564 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.074629 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.075202 4781 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.075894 4781 server.go:1280] "Started kubelet" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.076031 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.076314 4781 server.go:163] 
"Starting to listen" address="0.0.0.0" port=10250 Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.076318 4781 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.077035 4781 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.078039 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.078077 4781 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.078143 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 05:06:13.828247571 +0000 UTC Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.078360 4781 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.078381 4781 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.078498 4781 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.078209 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.159:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187f5622413bf3a2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-08 20:04:44.07584861 +0000 UTC 
m=+0.227131997,LastTimestamp:2025-12-08 20:04:44.07584861 +0000 UTC m=+0.227131997,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 08 20:04:44 crc systemd[1]: Started Kubernetes Kubelet. Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.079209 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.079208 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.079276 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.079615 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="200ms" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.080183 4781 server.go:460] "Adding debug handlers to kubelet server" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.087765 4781 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 
20:04:44.088025 4781 factory.go:55] Registering systemd factory Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.088153 4781 factory.go:221] Registration of the systemd container factory successfully Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.089099 4781 factory.go:153] Registering CRI-O factory Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.089228 4781 factory.go:221] Registration of the crio container factory successfully Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.089359 4781 factory.go:103] Registering Raw factory Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.089473 4781 manager.go:1196] Started watching for new ooms in manager Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.089774 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.089843 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.089869 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.089883 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 08 
20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090422 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090440 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090456 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090470 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090487 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090500 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090513 4781 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090528 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090541 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090561 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090577 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090593 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090611 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090627 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090640 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090653 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090666 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090679 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090692 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090757 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090777 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090794 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090815 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090832 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090845 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090858 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090870 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090883 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090895 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090908 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090947 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090959 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090972 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090984 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.090996 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091009 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091022 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091040 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091053 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091066 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091079 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091092 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091106 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091120 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091132 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091140 4781 manager.go:319] Starting recovery of all containers Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091146 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091337 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091370 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091393 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091409 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091452 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091468 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091483 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091497 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091510 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091523 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091539 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091553 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091566 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091579 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091592 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 08 
20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091672 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091685 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091698 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091713 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091732 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091749 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091767 4781 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091784 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091801 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091817 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091835 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091853 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091871 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091887 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091904 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091950 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091970 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.091990 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092008 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092026 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092044 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092057 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092070 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092087 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092107 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092125 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092142 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092161 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.092179 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094063 4781 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094167 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094294 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094313 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094327 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094418 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094475 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094559 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094576 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094640 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094654 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094713 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094740 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094820 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" 
seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094837 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094876 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094893 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094908 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094954 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094969 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094984 4781 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.094999 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.095039 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.095052 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.095066 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.095078 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098615 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098679 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098701 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098720 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098737 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098751 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098764 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098782 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098794 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098809 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098823 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098838 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098854 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098868 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098881 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098893 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098906 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098937 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.098983 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099001 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099041 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099057 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099070 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099084 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099096 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099109 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099124 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099138 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099152 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099164 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099176 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099190 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099204 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099217 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099233 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099245 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099260 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099272 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099285 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099298 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099309 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099321 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099334 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 08 
20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099347 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099362 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099375 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099388 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099403 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099418 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099433 4781 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099445 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099458 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099471 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099483 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099495 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099506 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099522 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099537 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099549 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099561 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099575 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099590 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099607 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099621 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099637 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099656 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099670 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099685 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" 
Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099703 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099719 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099739 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099767 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099784 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099799 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 08 20:04:44 crc 
kubenswrapper[4781]: I1208 20:04:44.099812 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099824 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099841 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099853 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099869 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099884 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099901 4781 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099945 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099959 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099971 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099988 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.099999 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.100012 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.100025 4781 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.100037 4781 reconstruct.go:97] "Volume reconstruction finished" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.100047 4781 reconciler.go:26] "Reconciler: start to sync state" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.110188 4781 manager.go:324] Recovery completed Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.118771 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.120407 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.120438 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.120447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.121183 4781 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.121199 4781 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.121216 4781 state_mem.go:36] "Initialized new in-memory state store" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.123195 4781 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.124543 4781 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.124584 4781 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.124616 4781 kubelet.go:2335] "Starting kubelet main sync loop" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.124666 4781 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.159714 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.159825 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.165692 4781 policy_none.go:49] "None policy: Start" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.166638 4781 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.166694 4781 state_mem.go:35] "Initializing new in-memory state store" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.180265 4781 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.224806 4781 kubelet.go:2359] 
"Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.227194 4781 manager.go:334] "Starting Device Plugin manager" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.227978 4781 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.228025 4781 server.go:79] "Starting device plugin registration server" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.228631 4781 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.228655 4781 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.229031 4781 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.229180 4781 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.229190 4781 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.237777 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.280226 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="400ms" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.329270 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc 
kubenswrapper[4781]: I1208 20:04:44.332135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.332199 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.332224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.332275 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.333108 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.425894 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.426134 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.427733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.427780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.427790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: 
I1208 20:04:44.427972 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.428562 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.428603 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.429260 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.429285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.429294 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.429650 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.429672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.429683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.429776 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.430179 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.430203 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.431072 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.431089 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.431098 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.431185 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.431548 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.431571 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.431982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.431995 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.432002 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.432230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.432241 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.432248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.432340 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.432698 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.432723 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.433121 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.433137 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.433145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.433707 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.433723 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.433732 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.433877 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.433904 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.434284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.434300 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.434310 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.434714 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.434730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.434741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508480 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508535 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508601 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508712 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508807 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508849 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508898 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.508971 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.509050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.509093 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 
20:04:44.509261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.509297 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.509389 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.533439 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.534854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.534965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.534984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.535049 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.535475 4781 kubelet_node_status.go:99] "Unable to 
register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611065 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611137 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611167 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611197 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 
20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611263 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611303 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611344 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611376 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611408 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611428 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611463 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611481 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611489 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611528 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611564 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611751 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611642 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611570 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611651 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611809 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611850 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.611905 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.681457 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="800ms" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.768350 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.791409 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.808706 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a859d04824667db79357283664c8245647f69b45d5b70d8feb35a5a05be2106e WatchSource:0}: Error finding container a859d04824667db79357283664c8245647f69b45d5b70d8feb35a5a05be2106e: Status 404 returned error can't find the container with id a859d04824667db79357283664c8245647f69b45d5b70d8feb35a5a05be2106e Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.818703 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.820203 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-0460caab0ad6a555afd2d2f293139429bc4f899a86ad387cbce219854e24b829 WatchSource:0}: Error finding container 0460caab0ad6a555afd2d2f293139429bc4f899a86ad387cbce219854e24b829: Status 404 returned error can't find the container with id 0460caab0ad6a555afd2d2f293139429bc4f899a86ad387cbce219854e24b829 Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.844265 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-76a2037636b484ba56054216041413d97039bc152165933838870c2083c6d687 WatchSource:0}: Error finding container 76a2037636b484ba56054216041413d97039bc152165933838870c2083c6d687: Status 404 returned error can't find the container with id 76a2037636b484ba56054216041413d97039bc152165933838870c2083c6d687 Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.847339 4781 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.859571 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.870833 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3d8c722419bcf0dbbf697798b2ac1ee2845dd6ca37f15887edda3f0091224118 WatchSource:0}: Error finding container 3d8c722419bcf0dbbf697798b2ac1ee2845dd6ca37f15887edda3f0091224118: Status 404 returned error can't find the container with id 3d8c722419bcf0dbbf697798b2ac1ee2845dd6ca37f15887edda3f0091224118 Dec 08 20:04:44 crc kubenswrapper[4781]: W1208 20:04:44.875406 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-7ee8f627812ce1c2ed1f217bc130c6ac9e1f1bdccc15d33bf1b6fba9b601bdbd WatchSource:0}: Error finding container 7ee8f627812ce1c2ed1f217bc130c6ac9e1f1bdccc15d33bf1b6fba9b601bdbd: Status 404 returned error can't find the container with id 7ee8f627812ce1c2ed1f217bc130c6ac9e1f1bdccc15d33bf1b6fba9b601bdbd Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.936606 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.937674 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.937712 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.937725 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 08 20:04:44 crc kubenswrapper[4781]: I1208 20:04:44.937752 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 20:04:44 crc kubenswrapper[4781]: E1208 20:04:44.938188 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.077139 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.079300 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 02:59:30.299797192 +0000 UTC Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.127877 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3d8c722419bcf0dbbf697798b2ac1ee2845dd6ca37f15887edda3f0091224118"} Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.130062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"76a2037636b484ba56054216041413d97039bc152165933838870c2083c6d687"} Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.133939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0460caab0ad6a555afd2d2f293139429bc4f899a86ad387cbce219854e24b829"} Dec 08 20:04:45 crc 
kubenswrapper[4781]: I1208 20:04:45.135997 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a859d04824667db79357283664c8245647f69b45d5b70d8feb35a5a05be2106e"} Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.136829 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7ee8f627812ce1c2ed1f217bc130c6ac9e1f1bdccc15d33bf1b6fba9b601bdbd"} Dec 08 20:04:45 crc kubenswrapper[4781]: W1208 20:04:45.140460 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:45 crc kubenswrapper[4781]: E1208 20:04:45.140527 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 08 20:04:45 crc kubenswrapper[4781]: W1208 20:04:45.335662 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:45 crc kubenswrapper[4781]: E1208 20:04:45.335985 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 
38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 08 20:04:45 crc kubenswrapper[4781]: W1208 20:04:45.378713 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:45 crc kubenswrapper[4781]: E1208 20:04:45.378778 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 08 20:04:45 crc kubenswrapper[4781]: E1208 20:04:45.482122 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="1.6s" Dec 08 20:04:45 crc kubenswrapper[4781]: W1208 20:04:45.628822 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:45 crc kubenswrapper[4781]: E1208 20:04:45.628881 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.159:6443: connect: connection refused" logger="UnhandledError" Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.738524 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.739774 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.739802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.739813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:45 crc kubenswrapper[4781]: I1208 20:04:45.739832 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 20:04:45 crc kubenswrapper[4781]: E1208 20:04:45.740226 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.159:6443: connect: connection refused" node="crc" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.077165 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.159:6443: connect: connection refused Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.079391 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 22:45:51.153161519 +0000 UTC Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.079438 4781 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 842h41m5.073725393s for next certificate rotation Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.141155 4781 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="bade44cb35be2dff3f2f4704f76eda71fec70287ccb1d443f2d8e740a4a4a627" exitCode=0 Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.141235 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bade44cb35be2dff3f2f4704f76eda71fec70287ccb1d443f2d8e740a4a4a627"} Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.141243 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.141996 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.142022 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.142031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.143175 4781 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810" exitCode=0 Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.143232 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810"} Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.143245 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.144563 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.144609 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.144630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.149228 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.150199 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.150236 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.150248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.148744 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84"} Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.151125 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e"} Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.151142 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04"} Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.151181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b"} Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.152237 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b" exitCode=0 Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.152288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b"} Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.152417 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.153304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.153326 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.153334 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.155040 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139" exitCode=0 Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.155067 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139"} Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.155132 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.155651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.155670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.155678 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.160353 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.172093 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.172124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:46 crc kubenswrapper[4781]: I1208 20:04:46.172134 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.158940 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20" exitCode=0 Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.159005 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.159107 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.159767 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.159790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.159798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.169005 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"75d40862597dbf91e845e78cefb4931fa6babeb03b810c71242b1ca5da77b02d"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.169103 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.169857 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.169878 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.169886 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.176573 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.176700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.176832 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.176948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.178855 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.178876 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.178885 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.180744 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181078 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181284 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181295 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d"} Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181872 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181938 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:47 crc 
kubenswrapper[4781]: I1208 20:04:47.181977 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.181998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.182009 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.341351 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.342397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.342435 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.342444 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.342467 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 20:04:47 crc kubenswrapper[4781]: I1208 20:04:47.376489 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185151 4781 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f" exitCode=0 Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185194 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f"} Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185239 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185227 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185266 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185814 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185840 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185855 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.185999 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186595 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:48 crc 
kubenswrapper[4781]: I1208 20:04:48.186646 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186655 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186726 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186736 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.186850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.187253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.187284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.187295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.940092 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:48 crc kubenswrapper[4781]: I1208 20:04:48.944346 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.193840 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500"} Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.193899 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50"} Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.193939 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.194014 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.193943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8"} Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.194034 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.194047 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d"} Dec 08 20:04:49 crc 
kubenswrapper[4781]: I1208 20:04:49.194065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e"} Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195487 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195520 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195598 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195515 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.195651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:49 crc kubenswrapper[4781]: I1208 20:04:49.573618 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.197738 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.197794 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.197965 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.199820 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.199881 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.199822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.199899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.199954 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.199979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.521782 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.607688 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.607872 4781 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.609614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.609679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:50 crc kubenswrapper[4781]: I1208 20:04:50.609699 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.032911 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.123725 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.123984 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.125653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.125727 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.125754 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.199567 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.199654 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.199666 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.200950 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.200997 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.201015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.201101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.201136 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.201149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:51 crc kubenswrapper[4781]: I1208 20:04:51.345768 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:04:52 crc kubenswrapper[4781]: I1208 20:04:52.203971 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:52 crc kubenswrapper[4781]: I1208 20:04:52.205424 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:52 crc kubenswrapper[4781]: I1208 20:04:52.205510 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:52 crc kubenswrapper[4781]: I1208 20:04:52.205536 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:53 crc kubenswrapper[4781]: I1208 20:04:53.206878 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:53 crc kubenswrapper[4781]: I1208 20:04:53.207829 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:53 crc kubenswrapper[4781]: I1208 20:04:53.207866 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:53 crc kubenswrapper[4781]: I1208 20:04:53.207876 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:54 crc kubenswrapper[4781]: I1208 20:04:54.033510 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 08 20:04:54 crc kubenswrapper[4781]: I1208 20:04:54.033652 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 20:04:54 crc kubenswrapper[4781]: E1208 20:04:54.238018 4781 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 08 20:04:56 crc kubenswrapper[4781]: I1208 20:04:56.090881 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 08 20:04:56 crc kubenswrapper[4781]: 
I1208 20:04:56.091062 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:04:56 crc kubenswrapper[4781]: I1208 20:04:56.092373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:04:56 crc kubenswrapper[4781]: I1208 20:04:56.092413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:04:56 crc kubenswrapper[4781]: I1208 20:04:56.092425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:04:57 crc kubenswrapper[4781]: I1208 20:04:57.077041 4781 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 08 20:04:57 crc kubenswrapper[4781]: E1208 20:04:57.082524 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 08 20:04:57 crc kubenswrapper[4781]: E1208 20:04:57.343652 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 08 20:04:57 crc kubenswrapper[4781]: I1208 20:04:57.376826 4781 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 08 20:04:57 crc kubenswrapper[4781]: I1208 20:04:57.376958 4781 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 08 20:04:57 crc kubenswrapper[4781]: W1208 20:04:57.411949 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 08 20:04:57 crc kubenswrapper[4781]: I1208 20:04:57.412053 4781 trace.go:236] Trace[43741544]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Dec-2025 20:04:47.410) (total time: 10001ms): Dec 08 20:04:57 crc kubenswrapper[4781]: Trace[43741544]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:04:57.411) Dec 08 20:04:57 crc kubenswrapper[4781]: Trace[43741544]: [10.001888731s] [10.001888731s] END Dec 08 20:04:57 crc kubenswrapper[4781]: E1208 20:04:57.412080 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 08 20:04:57 crc kubenswrapper[4781]: W1208 20:04:57.486900 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 08 20:04:57 crc kubenswrapper[4781]: I1208 20:04:57.487072 4781 
trace.go:236] Trace[1614244722]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Dec-2025 20:04:47.485) (total time: 10001ms): Dec 08 20:04:57 crc kubenswrapper[4781]: Trace[1614244722]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:04:57.486) Dec 08 20:04:57 crc kubenswrapper[4781]: Trace[1614244722]: [10.001502351s] [10.001502351s] END Dec 08 20:04:57 crc kubenswrapper[4781]: E1208 20:04:57.487105 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 08 20:04:57 crc kubenswrapper[4781]: W1208 20:04:57.864146 4781 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 08 20:04:57 crc kubenswrapper[4781]: I1208 20:04:57.864234 4781 trace.go:236] Trace[1823646311]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Dec-2025 20:04:47.862) (total time: 10001ms): Dec 08 20:04:57 crc kubenswrapper[4781]: Trace[1823646311]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:04:57.864) Dec 08 20:04:57 crc kubenswrapper[4781]: Trace[1823646311]: [10.001998135s] [10.001998135s] END Dec 08 20:04:57 crc kubenswrapper[4781]: E1208 20:04:57.864259 4781 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 08 20:04:58 crc kubenswrapper[4781]: I1208 20:04:58.072790 4781 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 08 20:04:58 crc kubenswrapper[4781]: I1208 20:04:58.072850 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 08 20:05:00 crc kubenswrapper[4781]: I1208 20:05:00.543756 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:05:00 crc kubenswrapper[4781]: I1208 20:05:00.545708 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:00 crc kubenswrapper[4781]: I1208 20:05:00.545768 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:00 crc kubenswrapper[4781]: I1208 20:05:00.545789 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:00 crc kubenswrapper[4781]: I1208 20:05:00.545825 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 20:05:00 crc kubenswrapper[4781]: E1208 20:05:00.549370 4781 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" 
node="crc" Dec 08 20:05:01 crc kubenswrapper[4781]: I1208 20:05:01.349737 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:05:01 crc kubenswrapper[4781]: I1208 20:05:01.349895 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:05:01 crc kubenswrapper[4781]: I1208 20:05:01.351058 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:01 crc kubenswrapper[4781]: I1208 20:05:01.351119 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:01 crc kubenswrapper[4781]: I1208 20:05:01.351140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:01 crc kubenswrapper[4781]: I1208 20:05:01.874716 4781 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 08 20:05:02 crc kubenswrapper[4781]: I1208 20:05:02.382126 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:05:02 crc kubenswrapper[4781]: I1208 20:05:02.382350 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:05:02 crc kubenswrapper[4781]: I1208 20:05:02.383690 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:02 crc kubenswrapper[4781]: I1208 20:05:02.383761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:02 crc kubenswrapper[4781]: I1208 20:05:02.383786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:02 crc kubenswrapper[4781]: I1208 20:05:02.388369 
4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:05:02 crc kubenswrapper[4781]: I1208 20:05:02.690818 4781 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.066964 4781 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.068771 4781 trace.go:236] Trace[1510302251]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Dec-2025 20:04:48.292) (total time: 14776ms): Dec 08 20:05:03 crc kubenswrapper[4781]: Trace[1510302251]: ---"Objects listed" error: 14776ms (20:05:03.068) Dec 08 20:05:03 crc kubenswrapper[4781]: Trace[1510302251]: [14.776158443s] [14.776158443s] END Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.068808 4781 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.079617 4781 apiserver.go:52] "Watching apiserver" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.082983 4781 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.083304 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.083715 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.083788 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.083890 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.083959 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.084051 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.084158 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.084371 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.084433 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.084531 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.088556 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.088646 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.088682 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.106407 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.106645 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.106815 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.107024 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.107113 4781 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.107235 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.135225 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.179502 4781 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.211080 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.227241 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.230945 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.232370 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb" exitCode=255 Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.232424 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb"} Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.235908 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.239246 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.239414 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.251023 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.259307 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268784 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268827 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268844 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268869 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268884 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268901 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268942 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268958 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268973 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.268989 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269004 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269020 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269036 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269049 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") 
" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269065 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269130 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269150 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269179 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269194 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269209 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269243 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269289 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269333 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269356 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 08 
20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269376 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269399 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269421 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269446 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269460 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269480 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269501 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269526 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269544 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269563 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269595 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 
20:05:03.269610 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269623 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269746 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269768 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269787 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269806 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269804 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269823 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269892 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269913 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269949 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" 
(UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269965 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269980 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.269998 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270015 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270033 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270050 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270067 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270091 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270107 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270122 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270138 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270156 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270196 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270214 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270232 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270248 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270264 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270282 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270298 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270316 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270332 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 08 20:05:03 crc 
kubenswrapper[4781]: I1208 20:05:03.270328 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270348 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270390 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270392 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270397 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270409 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270477 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270482 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270518 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270498 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270540 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270549 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270593 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270617 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270650 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270682 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 08 20:05:03 
crc kubenswrapper[4781]: I1208 20:05:03.270708 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270731 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270758 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270821 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270842 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270864 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270885 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270905 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270946 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 
20:05:03.270969 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270991 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271014 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271034 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271056 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271080 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271124 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271146 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271177 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271200 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271222 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271246 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271315 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271339 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271360 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271382 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271407 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271432 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271455 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271480 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271504 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271525 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271547 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271593 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271616 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271640 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271697 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271721 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271744 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271766 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271790 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271814 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271850 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271875 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271898 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271940 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271965 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271987 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272010 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272033 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272059 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272080 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272102 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272152 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272199 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272224 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 08 20:05:03 crc 
kubenswrapper[4781]: I1208 20:05:03.272247 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272270 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272337 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272360 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" 
(UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272386 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272410 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272432 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272458 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " 
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272504 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272547 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272594 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272617 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272639 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272662 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272683 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272708 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272732 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272756 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272778 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272799 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272823 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272846 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272868 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272891 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272948 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272972 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272994 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273060 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273107 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273131 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273152 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273175 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273198 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273221 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273244 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273268 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273290 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 08 
20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273313 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273338 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273361 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273382 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273404 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273428 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: 
\"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273452 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273475 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273499 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273525 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273548 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " 
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273570 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273596 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273618 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273661 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273717 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273773 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273798 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273826 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273855 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273881 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273967 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273995 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274024 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274048 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274072 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274141 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274158 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274172 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274185 4781 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274199 4781 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274212 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274224 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274234 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274249 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274262 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270649 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270696 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281696 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270704 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270795 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). 
InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270851 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270885 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270905 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270945 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270964 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.270969 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271065 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271139 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271260 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271313 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271357 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271414 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271432 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271451 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271447 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271495 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271559 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271620 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271666 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271692 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271813 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271824 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271899 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271934 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.271997 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272093 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272198 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272294 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272340 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272357 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272425 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272513 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.272758 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273811 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.273983 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274039 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274320 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274326 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274500 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274626 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.274832 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.275038 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.275057 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.275253 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.275300 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.275628 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.276599 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.277051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.277298 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.277357 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.277475 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.277568 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.277808 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.277846 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.278172 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.278469 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.278901 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.279295 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.280354 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.280983 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.281024 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:05:03.781005379 +0000 UTC m=+19.932288756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.282126 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281163 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281174 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281347 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281354 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281532 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281592 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281635 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.281757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.282446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.282614 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.282617 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.282707 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.282841 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.283112 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.283244 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.283266 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.283529 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.283582 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.283839 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.283913 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.284273 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.284385 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.284617 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.284806 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.284825 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.284853 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285057 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285107 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285160 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285269 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285317 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285343 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285383 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285600 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285640 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285692 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.285933 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.286085 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.286466 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.286596 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.286668 4781 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.287093 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.287294 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.287573 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.287607 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.287982 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.288942 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.289086 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.289331 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.289496 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.289681 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.289557 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.289975 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.290262 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.290282 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.290370 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.290591 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.290539 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.290736 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.290776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.290907 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.291582 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.291689 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.291731 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.291940 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.291992 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.292124 4781 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.294618 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295027 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295341 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295342 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295414 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295485 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295508 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295527 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295615 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295887 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.295896 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296029 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296099 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296246 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296319 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296828 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296611 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.296633 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296877 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.297032 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.297051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.297071 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:03.796915071 +0000 UTC m=+19.948198588 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.297080 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.297098 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296635 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.296738 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.297504 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.297676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.298022 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.298192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.298483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.298510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.298757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.303013 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.309139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.309317 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.310121 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.315160 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.315583 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.315614 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.315865 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.316577 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.316592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.316605 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.316657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.316691 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.316653 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.316767 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:03.816749114 +0000 UTC m=+19.968032491 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.318034 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.318100 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.318115 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.318219 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:03.818197081 +0000 UTC m=+19.969480468 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.318356 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.318457 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:03.818437978 +0000 UTC m=+19.969721365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.329066 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.330024 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.330186 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.330233 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.332134 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.333595 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kr4pr"] Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.334795 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.337646 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-568jn"] Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.339048 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.339124 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.339213 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.339747 4781 scope.go:117] "RemoveContainer" containerID="3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.339360 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-568jn"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.339214 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.339966 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.340413 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.342493 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.343431 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.344002 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.346844 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.347048 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.347790 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350344 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350389 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.358758 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350431 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350522 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350536 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350557 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350589 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350650 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.350699 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.367392 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375571 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375583 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375592 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375601 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375611 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375620 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375627 4781 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375635 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375643 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375651 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375659 4781 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375667 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375675 4781 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375683 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375692 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375700 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375708 4781 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375716 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375724 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375732 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375759 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375769 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375780 4781 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375791 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375802 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375812 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375824 4781 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375834 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375844 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375857 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375868 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375879 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375889 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375903 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375957 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375977 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.375988 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376000 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376011 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376021 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376031 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376040 4781 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376048 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376056 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376064 4781 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376072 4781 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376081 4781 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376094 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376155 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376167 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376301 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376331 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376356 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376369 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376381 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376392 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376402 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376414 4781 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376427 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376462 4781 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376481 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376493 4781 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376506 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376517 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376532 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376542 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376552 4781 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376563 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376602 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376614 4781 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376624 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376635 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376646 4781 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376661 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376672 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376683 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376693 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376703 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376716 4781 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376727 4781 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376738 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376747 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376758 4781 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376769 4781 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376779 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376799 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376810 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376820 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376831 4781 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376843 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376856 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376866 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376877 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376888 4781 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376899 4781 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376909 4781 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376964 4781 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376977 4781 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.376989 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377000 4781 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377013 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377024 4781 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377035 4781 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377046 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377056 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377072 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377083 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377093 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377104 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377116 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377134 4781 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377144 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377154 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377164 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377174 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377191 4781 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377202 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377213 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377224 4781 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377235 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc 
kubenswrapper[4781]: I1208 20:05:03.377246 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377260 4781 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377271 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377283 4781 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377294 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377305 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377316 4781 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377327 4781 reconciler_common.go:293] "Volume 
detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377337 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377349 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377369 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377381 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377391 4781 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377402 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377413 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377423 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377436 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377448 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377459 4781 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377470 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377481 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377492 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node 
\"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377502 4781 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377513 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377525 4781 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377537 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377547 4781 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377559 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377570 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc 
kubenswrapper[4781]: I1208 20:05:03.377581 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377591 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377601 4781 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377612 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377622 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377633 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377645 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377656 4781 reconciler_common.go:293] "Volume detached for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377666 4781 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377676 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377686 4781 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377696 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377707 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377717 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377728 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") 
on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377739 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377749 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377775 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377788 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377800 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377810 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377821 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc 
kubenswrapper[4781]: I1208 20:05:03.377831 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377842 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377852 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377863 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377874 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377884 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377895 4781 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377907 4781 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377933 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377946 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.377958 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.378233 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.378411 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.378938 
4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.384824 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.387056 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.390667 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.399661 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.408653 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.415533 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.418251 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.421301 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.428223 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.429986 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: W1208 20:05:03.435036 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-26cfa683e097db3df3797f8b76e7b34d4fae427a44bb06d959c30ea672f25c07 WatchSource:0}: Error finding container 26cfa683e097db3df3797f8b76e7b34d4fae427a44bb06d959c30ea672f25c07: Status 404 returned error can't find the container with id 26cfa683e097db3df3797f8b76e7b34d4fae427a44bb06d959c30ea672f25c07 Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.439663 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.453830 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.465000 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.473679 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479065 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p5kn\" (UniqueName: \"kubernetes.io/projected/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-kube-api-access-2p5kn\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479184 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-proxy-tls\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479288 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-rootfs\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479373 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93-hosts-file\") pod \"node-resolver-568jn\" (UID: \"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\") " pod="openshift-dns/node-resolver-568jn" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479501 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-mcd-auth-proxy-config\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479538 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qphws\" (UniqueName: \"kubernetes.io/projected/8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93-kube-api-access-qphws\") pod \"node-resolver-568jn\" (UID: \"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\") " pod="openshift-dns/node-resolver-568jn" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479578 4781 reconciler_common.go:293] "Volume detached for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479597 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.479611 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.482505 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.491095 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.501368 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.519241 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.580118 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93-hosts-file\") pod \"node-resolver-568jn\" (UID: \"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\") " pod="openshift-dns/node-resolver-568jn" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.580196 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-mcd-auth-proxy-config\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.580230 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qphws\" (UniqueName: \"kubernetes.io/projected/8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93-kube-api-access-qphws\") pod \"node-resolver-568jn\" (UID: \"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\") " pod="openshift-dns/node-resolver-568jn" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.580254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-proxy-tls\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.580275 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p5kn\" (UniqueName: \"kubernetes.io/projected/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-kube-api-access-2p5kn\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.580309 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-rootfs\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.580374 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-rootfs\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.580441 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93-hosts-file\") pod \"node-resolver-568jn\" (UID: \"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\") " pod="openshift-dns/node-resolver-568jn" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.581179 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-mcd-auth-proxy-config\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.585435 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-proxy-tls\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.601456 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qphws\" (UniqueName: \"kubernetes.io/projected/8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93-kube-api-access-qphws\") pod \"node-resolver-568jn\" (UID: \"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\") " pod="openshift-dns/node-resolver-568jn" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.602589 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p5kn\" 
(UniqueName: \"kubernetes.io/projected/7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8-kube-api-access-2p5kn\") pod \"machine-config-daemon-kr4pr\" (UID: \"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\") " pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.664486 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-568jn" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.668661 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:05:03 crc kubenswrapper[4781]: W1208 20:05:03.677558 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a03f1d5_f7f0_4b2b_8ae6_e2e7dd397d93.slice/crio-e3467e5d42e757a4ee7aba9ab2f97b092a3a43066ab63f38f7fd2adc575693e9 WatchSource:0}: Error finding container e3467e5d42e757a4ee7aba9ab2f97b092a3a43066ab63f38f7fd2adc575693e9: Status 404 returned error can't find the container with id e3467e5d42e757a4ee7aba9ab2f97b092a3a43066ab63f38f7fd2adc575693e9 Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.694518 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-zqc9l"] Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.695186 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gr5xw"] Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.695369 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.695378 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-tm5z7"] Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.695477 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.696218 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.696348 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-67t9k"] Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.696533 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.697300 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.700293 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.700484 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.703501 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.703701 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.704129 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 08 20:05:03 crc 
kubenswrapper[4781]: I1208 20:05:03.704415 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.704802 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.705002 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.705167 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.705208 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.705267 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.705328 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.705410 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.705553 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.716090 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to 
sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.730510 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.745123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.760693 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.776198 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.782033 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.782228 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-08 20:05:04.782209235 +0000 UTC m=+20.933492612 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.792179 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.808207 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.822018 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.832254 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.845670 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.867452 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883278 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883321 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-etc-openvswitch\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883357 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-node-log\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883374 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-script-lib\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883513 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883583 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmskv\" (UniqueName: \"kubernetes.io/projected/c74e396c-5b68-47be-b86b-9f48c02ec760-kube-api-access-tmskv\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883604 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-cnibin\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883640 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cnibin\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883664 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883690 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-systemd\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 
20:05:03.883712 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-daemon-config\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883738 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-conf-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883766 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883790 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883808 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-system-cni-dir\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " 
pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883825 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-systemd-units\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883842 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-cni-multus\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883885 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-slash\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-ovn\") pod \"ovnkube-node-67t9k\" 
(UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883959 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-cni-binary-copy\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.883984 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-k8s-cni-cncf-io\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884021 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-netns\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884050 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-kubelet\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-os-release\") pod \"multus-tm5z7\" (UID: 
\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884101 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-log-socket\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884122 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-system-cni-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-cni-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884163 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-kubelet\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884188 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-etc-kubernetes\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " 
pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884221 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884244 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-cni-bin\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884263 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-hostroot\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884287 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8j4f\" (UniqueName: \"kubernetes.io/projected/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-kube-api-access-c8j4f\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884313 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmtcw\" (UniqueName: \"kubernetes.io/projected/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-kube-api-access-nmtcw\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " 
pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884337 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-env-overrides\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884363 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-config\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-socket-dir-parent\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884418 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-multus-certs\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884444 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-netns\") pod \"ovnkube-node-67t9k\" (UID: 
\"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-openvswitch\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884479 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-bin\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884507 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884528 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-os-release\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884551 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-ovn-kubernetes\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884591 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3526d83-eb7e-486e-9357-80df536d09fd-ovn-node-metrics-cert\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-var-lib-openvswitch\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884629 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-netd\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dbq4\" (UniqueName: \"kubernetes.io/projected/a3526d83-eb7e-486e-9357-80df536d09fd-kube-api-access-9dbq4\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.884676 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cni-binary-copy\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.884929 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.884949 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.884962 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885025 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:04.885006254 +0000 UTC m=+21.036289631 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885519 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885560 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:04.885550518 +0000 UTC m=+21.036833895 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885703 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885731 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885745 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885753 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885796 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:04.885778864 +0000 UTC m=+21.037062241 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.885817 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:04.885809525 +0000 UTC m=+21.037092902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.886095 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.899301 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.915038 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to 
sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee122
0d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.931271 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.945301 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.956805 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.968190 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985683 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-openvswitch\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985722 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-bin\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985739 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-netns\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985756 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-ovn-kubernetes\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985771 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3526d83-eb7e-486e-9357-80df536d09fd-ovn-node-metrics-cert\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985795 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-os-release\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985808 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cni-binary-copy\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985825 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-var-lib-openvswitch\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-ovn-kubernetes\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985872 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-netd\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985830 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-bin\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985798 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-openvswitch\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985839 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-netd\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985937 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-netns\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986052 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dbq4\" (UniqueName: \"kubernetes.io/projected/a3526d83-eb7e-486e-9357-80df536d09fd-kube-api-access-9dbq4\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986080 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-node-log\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.985993 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986107 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-script-lib\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986085 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-var-lib-openvswitch\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986133 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-os-release\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986246 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-node-log\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986266 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-etc-openvswitch\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986318 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-cnibin\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-cnibin\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmskv\" (UniqueName: \"kubernetes.io/projected/c74e396c-5b68-47be-b86b-9f48c02ec760-kube-api-access-tmskv\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986317 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-etc-openvswitch\") pod \"ovnkube-node-67t9k\" 
(UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986393 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986410 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-systemd\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986428 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-daemon-config\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986449 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cnibin\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986475 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-conf-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " 
pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-systemd-units\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986513 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986529 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-cni-multus\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986555 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-system-cni-dir\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " 
pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986576 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-conf-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986591 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-cni-binary-copy\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986578 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-system-cni-dir\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986613 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986622 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-k8s-cni-cncf-io\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" 
Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986633 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-systemd\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986639 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-cni-multus\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986646 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-netns\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cnibin\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-slash\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986545 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cni-binary-copy\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986703 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-ovn\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986722 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-netns\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986727 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-kubelet\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986748 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-k8s-cni-cncf-io\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986753 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-os-release\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986770 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-slash\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-cni-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986792 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-kubelet\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-kubelet\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986813 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-ovn\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986824 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-etc-kubernetes\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986849 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-log-socket\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986870 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-system-cni-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-os-release\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986891 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986908 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-etc-kubernetes\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986940 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-cni-bin\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986964 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-hostroot\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986983 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-script-lib\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986991 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-cni-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986596 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-systemd-units\") pod 
\"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.986989 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8j4f\" (UniqueName: \"kubernetes.io/projected/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-kube-api-access-c8j4f\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987041 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmtcw\" (UniqueName: \"kubernetes.io/projected/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-kube-api-access-nmtcw\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-log-socket\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987154 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-env-overrides\") pod \"ovnkube-node-67t9k\" (UID: 
\"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-cni-bin\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.987181 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: E1208 20:05:03.987228 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs podName:c74e396c-5b68-47be-b86b-9f48c02ec760 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:04.487213138 +0000 UTC m=+20.638496595 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs") pod "network-metrics-daemon-gr5xw" (UID: "c74e396c-5b68-47be-b86b-9f48c02ec760") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987231 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-daemon-config\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987181 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-hostroot\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-system-cni-dir\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987319 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-var-lib-kubelet\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-config\") pod 
\"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-cni-binary-copy\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987408 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-socket-dir-parent\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-multus-socket-dir-parent\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987538 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-multus-certs\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987570 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-env-overrides\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 
20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-host-run-multus-certs\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.987830 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-config\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:03 crc kubenswrapper[4781]: I1208 20:05:03.990612 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3526d83-eb7e-486e-9357-80df536d09fd-ovn-node-metrics-cert\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.001375 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:03Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc 
kubenswrapper[4781]: I1208 20:05:04.005329 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8j4f\" (UniqueName: \"kubernetes.io/projected/a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20-kube-api-access-c8j4f\") pod \"multus-tm5z7\" (UID: \"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\") " pod="openshift-multus/multus-tm5z7" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.005409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dbq4\" (UniqueName: \"kubernetes.io/projected/a3526d83-eb7e-486e-9357-80df536d09fd-kube-api-access-9dbq4\") pod \"ovnkube-node-67t9k\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.007816 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmskv\" (UniqueName: \"kubernetes.io/projected/c74e396c-5b68-47be-b86b-9f48c02ec760-kube-api-access-tmskv\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.008259 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmtcw\" (UniqueName: \"kubernetes.io/projected/a6a36d76-3939-4b05-a4b4-c97c3c03aaf9-kube-api-access-nmtcw\") pod \"multus-additional-cni-plugins-zqc9l\" (UID: \"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\") " pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.015287 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.028996 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.041442 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.050532 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tm5z7" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.051273 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: W1208 20:05:04.059274 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6a36d76_3939_4b05_a4b4_c97c3c03aaf9.slice/crio-2f7c5c0c6a89ddaff94b51f842ebfafd6a48220b1403fa2ef0a8a1d6c42a4850 WatchSource:0}: Error finding container 2f7c5c0c6a89ddaff94b51f842ebfafd6a48220b1403fa2ef0a8a1d6c42a4850: Status 404 returned error can't find the container with id 2f7c5c0c6a89ddaff94b51f842ebfafd6a48220b1403fa2ef0a8a1d6c42a4850 Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.070123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.080403 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.085304 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: W1208 20:05:04.094002 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3526d83_eb7e_486e_9357_80df536d09fd.slice/crio-5ba994fef7e3867eb1a9ffc1b0a942969dc4cc1f270552ea31887c53bdd46d2a WatchSource:0}: Error finding container 5ba994fef7e3867eb1a9ffc1b0a942969dc4cc1f270552ea31887c53bdd46d2a: Status 404 returned error can't find the container with id 5ba994fef7e3867eb1a9ffc1b0a942969dc4cc1f270552ea31887c53bdd46d2a Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.138533 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.139340 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.140655 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.141572 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.142393 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.142981 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.143678 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.144415 4781 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.148591 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.149364 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.150400 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.150910 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.152006 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.152632 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.154516 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.155073 4781 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.156067 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.156631 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.157072 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.159938 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.160514 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.160964 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.161880 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.162306 4781 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.163279 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.163658 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.166667 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.167003 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] 
\\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.167434 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.167888 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.168858 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.169501 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.170314 4781 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.170412 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.172180 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.174398 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.174822 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.176300 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.177372 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.178476 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.178441 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.179521 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.180820 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.181688 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.182650 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.183764 4781 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.184392 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.185295 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.185848 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.187312 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.188152 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.189088 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.189619 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.190226 4781 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.191238 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.191798 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.192888 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.194101 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.249316 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a"} Dec 08 20:05:04 crc kubenswrapper[4781]: 
I1208 20:05:04.249364 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.249373 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"60463b920b181290af65f1b3affc7343f9adc43e700af4482c5d19d647ea6179"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.251785 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"26cfa683e097db3df3797f8b76e7b34d4fae427a44bb06d959c30ea672f25c07"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.257818 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tm5z7" event={"ID":"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20","Type":"ContainerStarted","Data":"f7d45c81b8a7d312536211546177b8d8a5a9c1c3153ecfb5db205432fc8e46d1"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.269553 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.289189 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.289242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.289254 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"e6979f1bd19309268bad90597ed7843574415c2786dde9881bd14c5377edf93c"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.303414 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.303441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.303555 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e310a64ed91a1a4f61c86065987a80dffe82931aaca16b6c88252c2e5dbb6e16"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.309837 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.313893 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.314601 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.319269 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" event={"ID":"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9","Type":"ContainerStarted","Data":"2f7c5c0c6a89ddaff94b51f842ebfafd6a48220b1403fa2ef0a8a1d6c42a4850"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.325270 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"5ba994fef7e3867eb1a9ffc1b0a942969dc4cc1f270552ea31887c53bdd46d2a"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.328720 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-568jn" event={"ID":"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93","Type":"ContainerStarted","Data":"5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.328757 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-568jn" event={"ID":"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93","Type":"ContainerStarted","Data":"e3467e5d42e757a4ee7aba9ab2f97b092a3a43066ab63f38f7fd2adc575693e9"} Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.330867 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.340948 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.352237 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.374085 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.419766 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.456084 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.494310 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc 
kubenswrapper[4781]: I1208 20:05:04.494595 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.494722 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.494773 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs podName:c74e396c-5b68-47be-b86b-9f48c02ec760 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:05.494757967 +0000 UTC m=+21.646041344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs") pod "network-metrics-daemon-gr5xw" (UID: "c74e396c-5b68-47be-b86b-9f48c02ec760") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.537326 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.577020 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.623764 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.653733 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.706818 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.734935 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.777005 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.796863 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.797128 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:05:06.797100048 +0000 UTC m=+22.948383425 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.841127 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.863131 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.897727 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.897791 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.897813 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.897834 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.897945 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.897989 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.898021 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.898014 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.898027 4781 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:06.898010899 +0000 UTC m=+23.049294276 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.898034 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.898139 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:06.898122652 +0000 UTC m=+23.049406029 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.898169 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:06.898156233 +0000 UTC m=+23.049439610 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.897994 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.898199 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:04 crc kubenswrapper[4781]: E1208 20:05:04.898209 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:04 crc 
kubenswrapper[4781]: E1208 20:05:04.898228 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:06.898223074 +0000 UTC m=+23.049506451 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.912909 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.966331 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:04 crc kubenswrapper[4781]: I1208 20:05:04.996684 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e44143
93493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T2
0:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.038433 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.065711 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 
shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.102430 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.124803 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.124827 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.124860 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.124802 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:05 crc kubenswrapper[4781]: E1208 20:05:05.124966 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:05 crc kubenswrapper[4781]: E1208 20:05:05.125045 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:05 crc kubenswrapper[4781]: E1208 20:05:05.125121 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:05 crc kubenswrapper[4781]: E1208 20:05:05.125180 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.148775 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.332648 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6a36d76-3939-4b05-a4b4-c97c3c03aaf9" containerID="3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf" exitCode=0 Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.332772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" event={"ID":"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9","Type":"ContainerDied","Data":"3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf"} Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.334651 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f" exitCode=0 Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.334681 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" 
event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"} Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.336307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tm5z7" event={"ID":"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20","Type":"ContainerStarted","Data":"4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8"} Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.348265 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.363338 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.377677 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.390321 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc 
kubenswrapper[4781]: I1208 20:05:05.405322 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.417292 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.429270 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.454954 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.491741 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.503978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:05 crc kubenswrapper[4781]: E1208 20:05:05.504125 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:05 crc kubenswrapper[4781]: E1208 20:05:05.504203 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs podName:c74e396c-5b68-47be-b86b-9f48c02ec760 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:07.50418393 +0000 UTC m=+23.655467307 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs") pod "network-metrics-daemon-gr5xw" (UID: "c74e396c-5b68-47be-b86b-9f48c02ec760") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.533536 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.575704 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.627508 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.654989 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.699486 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.734303 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.779580 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.818981 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.853283 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"prox
y-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.892969 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\
\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.935532 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:05 crc kubenswrapper[4781]: I1208 20:05:05.974118 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:05Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.014686 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.057970 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.093579 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.113295 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.129529 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.136092 4781 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.162476 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.207742 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bon
d-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\
"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.243448 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc 
kubenswrapper[4781]: I1208 20:05:06.274562 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.317167 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.340995 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a"} Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.342486 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6a36d76-3939-4b05-a4b4-c97c3c03aaf9" containerID="1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0" exitCode=0 Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.342528 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" event={"ID":"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9","Type":"ContainerDied","Data":"1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0"} Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.345552 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"} Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.345576 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"} Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.345584 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"} Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.345592 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"} Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.345602 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"} Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.345609 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"} Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.360782 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.371781 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.415259 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.455189 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.494102 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.534291 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.574751 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.620323 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.657893 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.696129 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.736344 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.777100 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.813584 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.813775 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:05:10.813749065 +0000 UTC m=+26.965032482 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.816026 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc 
kubenswrapper[4781]: I1208 20:05:06.854229 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.897959 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.914330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.914403 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.914452 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914491 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914526 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.914543 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914559 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914599 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914610 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914617 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914621 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914572 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:10.914553603 +0000 UTC m=+27.065836990 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914706 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:10.914682376 +0000 UTC m=+27.065965783 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914730 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914738 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:10.914721337 +0000 UTC m=+27.066004754 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:06 crc kubenswrapper[4781]: E1208 20:05:06.914844 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:10.914814539 +0000 UTC m=+27.066097957 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.966678 4781 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.966768 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.969211 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.969262 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.969281 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.969402 4781 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 08 20:05:06 crc kubenswrapper[4781]: I1208 20:05:06.993613 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574
53265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-08T20:05:06Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.006863 4781 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.007223 4781 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.008586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.008618 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.008630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.008649 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.008661 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.018968 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.021930 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.021965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.021974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.021986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.021995 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.033734 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.037088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.037120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.037158 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.037174 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.037185 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.049061 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.052468 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.052498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.052506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.052527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.052537 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.059161 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.069464 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.073069 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.073096 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.073106 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.073120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.073132 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.083624 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.083776 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.085356 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.085384 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.085395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.085411 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.085422 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.099546 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.124751 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.124790 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.124861 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.124912 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.125003 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.125193 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.125178 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.125242 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.135632 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.173996 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.187560 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.187597 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.187607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 
20:05:07.187621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.187632 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.215716 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.254543 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc 
kubenswrapper[4781]: I1208 20:05:07.289772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.289817 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.289828 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.289843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.289854 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.295906 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.325273 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jphfq"] Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.325701 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.335039 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPa
th\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.346407 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.349972 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6a36d76-3939-4b05-a4b4-c97c3c03aaf9" containerID="0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151" exitCode=0 Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.350034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" event={"ID":"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9","Type":"ContainerDied","Data":"0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.366813 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.386523 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.392161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.392192 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.392201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.392214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.392222 4781 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.406603 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.418144 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-host\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.418220 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htfg\" (UniqueName: \"kubernetes.io/projected/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-kube-api-access-9htfg\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.418363 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-serviceca\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.455132 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.495067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.495100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.495109 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.495124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.495136 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.495546 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.519203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-host\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.519247 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.519265 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9htfg\" (UniqueName: \"kubernetes.io/projected/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-kube-api-access-9htfg\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.519281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-serviceca\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.519287 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-host\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.519445 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:07 crc kubenswrapper[4781]: E1208 20:05:07.519545 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs podName:c74e396c-5b68-47be-b86b-9f48c02ec760 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:11.519519212 +0000 UTC m=+27.670802649 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs") pod "network-metrics-daemon-gr5xw" (UID: "c74e396c-5b68-47be-b86b-9f48c02ec760") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.520866 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-serviceca\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.535694 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.566413 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htfg\" (UniqueName: \"kubernetes.io/projected/e290f4ba-e014-443f-bbaa-1eeb23a9bd15-kube-api-access-9htfg\") pod \"node-ca-jphfq\" (UID: \"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\") " pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.593665 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f78
0a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.598149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.598190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.598201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc 
kubenswrapper[4781]: I1208 20:05:07.598218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.598231 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.633383 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.636496 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jphfq" Dec 08 20:05:07 crc kubenswrapper[4781]: W1208 20:05:07.648228 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode290f4ba_e014_443f_bbaa_1eeb23a9bd15.slice/crio-45d43f5a9a9cc17650a997e8c4baf7fe1f0bad3380770216843f8af05f0d89f9 WatchSource:0}: Error finding container 45d43f5a9a9cc17650a997e8c4baf7fe1f0bad3380770216843f8af05f0d89f9: Status 404 returned error can't find the container with id 45d43f5a9a9cc17650a997e8c4baf7fe1f0bad3380770216843f8af05f0d89f9 Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.678117 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.700821 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.700860 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.700869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.700884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.700894 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.716774 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.757531 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.793129 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.803366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.803398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.803409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.803425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.803434 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.834731 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.874515 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.905399 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.905439 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.905449 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.905465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.905478 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:07Z","lastTransitionTime":"2025-12-08T20:05:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.914731 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.957849 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:07 crc kubenswrapper[4781]: I1208 20:05:07.993864 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:07Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.008036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.008074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.008083 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc 
kubenswrapper[4781]: I1208 20:05:08.008099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.008109 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.075178 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.089566 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 
20:05:08.109722 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.109760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.109770 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.109785 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.109796 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.113189 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc 
kubenswrapper[4781]: I1208 20:05:08.159980 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.195151 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf8945
43f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.212410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.212577 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.212634 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.212691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.212766 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.233316 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.274768 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.315312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.315380 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.315392 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.315409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.315422 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.355006 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jphfq" event={"ID":"e290f4ba-e014-443f-bbaa-1eeb23a9bd15","Type":"ContainerStarted","Data":"10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.355067 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jphfq" event={"ID":"e290f4ba-e014-443f-bbaa-1eeb23a9bd15","Type":"ContainerStarted","Data":"45d43f5a9a9cc17650a997e8c4baf7fe1f0bad3380770216843f8af05f0d89f9"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.358632 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.361300 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6a36d76-3939-4b05-a4b4-c97c3c03aaf9" containerID="5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09" exitCode=0 Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.361341 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" event={"ID":"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9","Type":"ContainerDied","Data":"5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.373937 4781 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.389698 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.404350 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 
20:05:08.417266 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.417298 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.417307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.417323 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.417333 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.433467 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc 
kubenswrapper[4781]: I1208 20:05:08.480068 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.514215 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf8945
43f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.519554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.519580 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.519589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.519601 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.519610 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.552427 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.595753 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.621614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.621649 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.621662 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.621679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.621691 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.634253 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.680656 4781 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.715111 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 
20:05:08.725791 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.726327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.726341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.726381 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.726396 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.756228 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.797251 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.828338 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.828371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.828381 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.828396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.828406 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.835495 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.874243 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.930279 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.930331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.930347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.930407 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:08 crc kubenswrapper[4781]: I1208 20:05:08.930438 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:08Z","lastTransitionTime":"2025-12-08T20:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.005898 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:08Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.025789 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.033802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.033830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.033837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.033851 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.033861 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.039927 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.054510 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.075891 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.112970 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc 
kubenswrapper[4781]: I1208 20:05:09.125508 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.125551 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:09 crc kubenswrapper[4781]: E1208 20:05:09.125628 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.125655 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:09 crc kubenswrapper[4781]: E1208 20:05:09.125782 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.125899 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:09 crc kubenswrapper[4781]: E1208 20:05:09.125987 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:09 crc kubenswrapper[4781]: E1208 20:05:09.126046 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.135887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.135952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.135962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.135975 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.135985 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.167422 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c2
77068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.195021 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.235620 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.239345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.239723 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.239801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.239865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.239933 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.276675 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.314255 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.341884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc 
kubenswrapper[4781]: I1208 20:05:09.341940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.341952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.341967 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.341977 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.353137 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.368378 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6a36d76-3939-4b05-a4b4-c97c3c03aaf9" containerID="8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be" exitCode=0 Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.368422 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" event={"ID":"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9","Type":"ContainerDied","Data":"8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.400425 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.434007 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.444077 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc 
kubenswrapper[4781]: I1208 20:05:09.444114 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.444126 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.444146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.444159 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.473472 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.517275 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.547039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.547079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.547092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.547108 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.547120 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.554689 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c
82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.592934 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.633589 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e984
0c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.649108 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.649136 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.649145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.649157 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.649166 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.679580 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9
dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswi
tch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.714738 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 
08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.751939 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.751965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.751974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.751986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.751996 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.765517 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.792752 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.833272 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.853490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.853522 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.853533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.853547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.853557 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.877105 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.913616 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.956018 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.956060 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.956072 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.956088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.956100 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:09Z","lastTransitionTime":"2025-12-08T20:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.956530 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e
874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:09 crc kubenswrapper[4781]: I1208 20:05:09.993783 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650
427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:09Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.034407 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f78
0a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.058340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.058404 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.058423 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc 
kubenswrapper[4781]: I1208 20:05:10.058447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.058467 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.077818 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.120190 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 
reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.153605 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.163804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.163840 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.163850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.163863 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.163872 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.194684 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.266005 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.266335 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.266346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.266360 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.266370 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.368394 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.368628 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.368739 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.368817 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.368941 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.375262 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6a36d76-3939-4b05-a4b4-c97c3c03aaf9" containerID="924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa" exitCode=0 Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.375315 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" event={"ID":"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9","Type":"ContainerDied","Data":"924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.388273 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.401179 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.416284 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.430383 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.445344 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.456364 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.471087 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.471130 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.471141 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.471154 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.471164 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.473684 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.518731 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.554140 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.592506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.592555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.592566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc 
kubenswrapper[4781]: I1208 20:05:10.592582 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.592593 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.607331 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.635604 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.673155 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc 
kubenswrapper[4781]: I1208 20:05:10.694667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.694700 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.694709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.694723 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.694732 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.721513 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.756449 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.794942 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.796639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.796664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.796677 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.796693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.796702 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.834166 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:10Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.850589 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.850908 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:05:18.850891089 +0000 UTC m=+35.002174466 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.899062 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.899101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.899113 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.899128 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.899138 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:10Z","lastTransitionTime":"2025-12-08T20:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.951949 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.952012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.952042 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:10 crc kubenswrapper[4781]: I1208 20:05:10.952069 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952143 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952161 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952171 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952191 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952224 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952245 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952258 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952189 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:10 crc 
kubenswrapper[4781]: E1208 20:05:10.952213 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:18.952199299 +0000 UTC m=+35.103482676 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952350 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:18.952326323 +0000 UTC m=+35.103609740 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952386 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:18.952368674 +0000 UTC m=+35.103652171 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:10 crc kubenswrapper[4781]: E1208 20:05:10.952413 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:18.952398645 +0000 UTC m=+35.103682132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.001572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.001613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.001624 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.001639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.001651 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.103268 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.103313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.103329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.103349 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.103364 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.125345 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.125360 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.125415 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.125483 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:11 crc kubenswrapper[4781]: E1208 20:05:11.125669 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:11 crc kubenswrapper[4781]: E1208 20:05:11.125808 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:11 crc kubenswrapper[4781]: E1208 20:05:11.125970 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:11 crc kubenswrapper[4781]: E1208 20:05:11.126087 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.205791 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.205845 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.205859 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.205886 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.205901 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.308637 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.308707 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.308721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.308738 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.308749 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.381728 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.381925 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.385184 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" event={"ID":"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9","Type":"ContainerStarted","Data":"da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.393333 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.437704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.437743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.437753 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.437770 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.437779 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.441394 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.445647 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.453315 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.465370 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.478218 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa829
9b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.495497 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.508324 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.517795 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.529999 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.540383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.540426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.540436 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.540450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.540460 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.541941 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.555478 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.560231 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:11 crc kubenswrapper[4781]: E1208 20:05:11.560378 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:11 crc kubenswrapper[4781]: E1208 20:05:11.560419 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs podName:c74e396c-5b68-47be-b86b-9f48c02ec760 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:19.560406483 +0000 UTC m=+35.711689850 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs") pod "network-metrics-daemon-gr5xw" (UID: "c74e396c-5b68-47be-b86b-9f48c02ec760") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.567889 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"im
age\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.579549 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.591454 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.602301 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.627483 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.639277 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:
03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.642422 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 
08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.642466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.642477 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.642493 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.642504 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.658318 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.669284 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.681126 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.693809 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.712444 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc 
kubenswrapper[4781]: I1208 20:05:11.744517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.744552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.744561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.744574 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.744582 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.761350 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.795287 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.835513 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.846302 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.846337 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.846346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.846358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.846367 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.873346 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.915480 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.948414 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:11 crc 
kubenswrapper[4781]: I1208 20:05:11.948456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.948467 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.948482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.948491 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:11Z","lastTransitionTime":"2025-12-08T20:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.955358 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:11 crc kubenswrapper[4781]: I1208 20:05:11.992568 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:11Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.032847 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.051159 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.051206 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.051218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.051236 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.051249 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.075722 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.114584 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.152822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.152847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.152856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.152868 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.152880 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.255573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.255621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.255635 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.255661 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.255677 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.358019 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.358069 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.358083 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.358101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.358114 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.387796 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.388969 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.413838 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.430793 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.449035 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.460535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.460594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.460606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.460641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.460652 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.462661 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.475329 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.494404 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.505677 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc 
kubenswrapper[4781]: I1208 20:05:12.523443 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.534566 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.543331 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.553136 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.563327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.563360 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.563369 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.563381 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.563390 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.566180 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:
05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.594695 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.632846 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.666016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.666307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.666398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.666482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.666560 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.675178 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.717409 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.754221 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:12Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.769331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.769532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.769599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.769666 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.769725 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.871206 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.871405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.871464 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.871521 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.871620 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.974556 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.974623 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.974642 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.974668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:12 crc kubenswrapper[4781]: I1208 20:05:12.974688 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:12Z","lastTransitionTime":"2025-12-08T20:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.076984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.077025 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.077035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.077049 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.077060 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.125842 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.125958 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.125968 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.126280 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:13 crc kubenswrapper[4781]: E1208 20:05:13.126463 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:13 crc kubenswrapper[4781]: E1208 20:05:13.126561 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:13 crc kubenswrapper[4781]: E1208 20:05:13.126629 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:13 crc kubenswrapper[4781]: E1208 20:05:13.126675 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.179766 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.179822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.179834 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.179852 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.179865 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.283179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.283244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.283271 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.283298 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.283318 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.385856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.385900 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.385911 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.385944 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.385957 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.394992 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/0.log" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.398129 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74" exitCode=1 Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.398191 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.398748 4781 scope.go:117] "RemoveContainer" containerID="78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.414369 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.426552 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.435696 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.457023 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.470115 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.484795 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.488703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.488754 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.488766 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.488780 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.488789 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.509296 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:13Z\\\",\\\"message\\\":\\\"kg/client/informers/externalversions/factory.go:117\\\\nI1208 20:05:12.797991 6063 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798093 6063 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798223 6063 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798694 6063 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798837 6063 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.799116 6063 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overr
ides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.523849 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.541403 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.561111 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.570478 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc 
kubenswrapper[4781]: I1208 20:05:13.589481 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.591068 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.591097 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.591105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.591120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.591128 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.600903 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.610272 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.622789 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.634604 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:13Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.694046 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc 
kubenswrapper[4781]: I1208 20:05:13.694088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.694101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.694119 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.694134 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.796410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.796452 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.796464 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.796482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.796494 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.861648 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.898509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.898545 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.898555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.898575 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:13 crc kubenswrapper[4781]: I1208 20:05:13.898589 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:13Z","lastTransitionTime":"2025-12-08T20:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.000875 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.000936 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.000947 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.000963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.000973 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.102895 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.102959 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.102976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.102992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.103002 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.137394 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.149333 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.159616 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.172980 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.185441 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.197448 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.205438 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.205464 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.205473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.205485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.205494 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.208099 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.216961 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.227652 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.239326 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.264259 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:13Z\\\",\\\"message\\\":\\\"kg/client/informers/externalversions/factory.go:117\\\\nI1208 20:05:12.797991 6063 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798093 6063 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798223 6063 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798694 6063 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798837 6063 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.799116 6063 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overr
ides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.282233 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.299445 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.307193 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.307397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.307507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc 
kubenswrapper[4781]: I1208 20:05:14.307585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.307645 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.310815 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.326499 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.338806 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc 
kubenswrapper[4781]: I1208 20:05:14.402165 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/1.log" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.402693 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/0.log" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.405021 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406" exitCode=1 Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.405056 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.405101 4781 scope.go:117] "RemoveContainer" containerID="78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.405676 4781 scope.go:117] "RemoveContainer" containerID="1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406" Dec 08 20:05:14 crc kubenswrapper[4781]: E1208 20:05:14.406213 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.409582 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.409620 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.409631 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.409647 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.409659 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.419761 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.431377 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.442835 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.454691 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.464111 4781 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.477722 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T
20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.491130 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.509532 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78248307630e06c6d814b026554f55caee80f2833a25b252d56f741d387acf74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:13Z\\\",\\\"message\\\":\\\"kg/client/informers/externalversions/factory.go:117\\\\nI1208 20:05:12.797991 6063 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798093 6063 reflector.go:311] Stopping 
reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798223 6063 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798694 6063 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.798837 6063 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:12.799116 6063 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"}
,{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.511764 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.511791 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.511798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.511810 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.511819 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.533068 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.546303 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.560563 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.576472 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.589601 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc 
kubenswrapper[4781]: I1208 20:05:14.601612 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.614413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.614445 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.614453 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.614467 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.614478 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.640109 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:
05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.673480 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.717160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.717202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.717212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.717230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.717242 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.820307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.820349 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.820371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.820390 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.820402 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.923448 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.923513 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.923524 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.923539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:14 crc kubenswrapper[4781]: I1208 20:05:14.923550 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:14Z","lastTransitionTime":"2025-12-08T20:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.025820 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.026104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.026188 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.026292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.026380 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.125953 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.125948 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.125986 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:15 crc kubenswrapper[4781]: E1208 20:05:15.126514 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:15 crc kubenswrapper[4781]: E1208 20:05:15.126315 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.126016 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:15 crc kubenswrapper[4781]: E1208 20:05:15.126751 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:15 crc kubenswrapper[4781]: E1208 20:05:15.126640 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.128641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.128675 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.128686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.128700 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.128713 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.231653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.231690 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.231701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.231717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.231728 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.334502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.334552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.334566 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.334585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.334597 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.410624 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/1.log" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.414023 4781 scope.go:117] "RemoveContainer" containerID="1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406" Dec 08 20:05:15 crc kubenswrapper[4781]: E1208 20:05:15.414168 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.426387 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.437152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.437176 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.437186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.437199 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.437214 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.439530 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.458213 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.476006 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.485709 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.497043 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.508794 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.517230 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc 
kubenswrapper[4781]: I1208 20:05:15.527117 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.538722 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.538990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc 
kubenswrapper[4781]: I1208 20:05:15.539025 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.539037 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.539055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.539065 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.547856 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.557190 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.572217 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 
shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.598190 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.626813 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.639237 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:15Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.640815 4781 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.640844 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.640853 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.640871 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.640881 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.743581 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.743967 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.744105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.744256 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.744379 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.861289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.861352 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.861391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.861421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.861443 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.964750 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.964801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.964817 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.964836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:15 crc kubenswrapper[4781]: I1208 20:05:15.964850 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:15Z","lastTransitionTime":"2025-12-08T20:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.067664 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.067715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.067730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.067748 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.067764 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.086513 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.098012 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.108939 4781 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.122477 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b3
35e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.136809 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.147559 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.158636 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.170802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.170842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.170853 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.170869 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.170881 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.171436 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.188477 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.198895 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.215536 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://
890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.226122 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.237560 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.250946 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.262021 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.273965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.274004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.274014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.274029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.274039 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.275508 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:
05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.286161 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.376495 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.376538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.376547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.376562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.376571 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.478339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.478370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.478378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.478390 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.478399 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.580449 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.580482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.580490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.580502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.580510 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.683264 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.683297 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.683332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.683345 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.683353 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.786397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.786446 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.786457 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.786474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.786487 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.830135 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq"] Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.830720 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.833125 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.833427 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.848195 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.861481 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.875187 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.888779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.888823 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.888835 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.888850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.888862 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.890659 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.901818 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.913941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35f5c146-0078-4ac0-9b34-b5ae446e4e35-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.913980 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35f5c146-0078-4ac0-9b34-b5ae446e4e35-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.914011 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vljx6\" (UniqueName: \"kubernetes.io/projected/35f5c146-0078-4ac0-9b34-b5ae446e4e35-kube-api-access-vljx6\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.914110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35f5c146-0078-4ac0-9b34-b5ae446e4e35-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.914616 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.930723 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.951370 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.980910 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.991273 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.991313 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.991322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.991336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.991344 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:16Z","lastTransitionTime":"2025-12-08T20:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:16 crc kubenswrapper[4781]: I1208 20:05:16.992591 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:16Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.006221 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.015092 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35f5c146-0078-4ac0-9b34-b5ae446e4e35-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.015165 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35f5c146-0078-4ac0-9b34-b5ae446e4e35-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.015191 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35f5c146-0078-4ac0-9b34-b5ae446e4e35-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.015231 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vljx6\" (UniqueName: \"kubernetes.io/projected/35f5c146-0078-4ac0-9b34-b5ae446e4e35-kube-api-access-vljx6\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.015710 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35f5c146-0078-4ac0-9b34-b5ae446e4e35-env-overrides\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.016288 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35f5c146-0078-4ac0-9b34-b5ae446e4e35-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.022330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35f5c146-0078-4ac0-9b34-b5ae446e4e35-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.026181 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b
4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305daf
aa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.036763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vljx6\" (UniqueName: \"kubernetes.io/projected/35f5c146-0078-4ac0-9b34-b5ae446e4e35-kube-api-access-vljx6\") pod \"ovnkube-control-plane-749d76644c-vbcgq\" (UID: \"35f5c146-0078-4ac0-9b34-b5ae446e4e35\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.039821 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc 
kubenswrapper[4781]: I1208 20:05:17.049673 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.059293 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.069901 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.078042 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.093639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.093695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.093719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.093759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.093776 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.125380 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.125436 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.125389 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.125515 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.125529 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.125626 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.125692 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.125743 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.152707 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.196050 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.196259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.196451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.196565 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.196647 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.299259 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.299304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.299317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.299335 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.299348 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.403332 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.403380 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.403391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.403408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.403419 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.404570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.404603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.404612 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.404626 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.404636 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.421071 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.422654 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" event={"ID":"35f5c146-0078-4ac0-9b34-b5ae446e4e35","Type":"ContainerStarted","Data":"054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.422737 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" event={"ID":"35f5c146-0078-4ac0-9b34-b5ae446e4e35","Type":"ContainerStarted","Data":"caebd2a775df0da7b3a4707989aad0dbd7b5fb6ae91d8e30925f78579dd4368c"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.425721 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.425772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.425784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.425802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.425812 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.443166 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.446946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.446982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.446993 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.447008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.447018 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.458361 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.462472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.462510 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.462519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.462533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.462542 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.474824 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.478800 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.478834 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.478842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.478856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.478865 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.491720 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:17Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:17 crc kubenswrapper[4781]: E1208 20:05:17.491841 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.505708 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.505752 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.505764 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.505787 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.505806 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.607870 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.607905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.607939 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.607956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.607968 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.709854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.709881 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.709888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.709900 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.709908 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.812027 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.812099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.812125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.812155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.812180 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.914979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.915053 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.915076 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.915102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:17 crc kubenswrapper[4781]: I1208 20:05:17.915119 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:17Z","lastTransitionTime":"2025-12-08T20:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.017518 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.017553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.017562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.017576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.017585 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.120363 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.120396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.120405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.120421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.120430 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.223309 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.223343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.223353 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.223367 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.223376 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.325477 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.325510 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.325518 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.325532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.325540 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.426795 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.426820 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.426828 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.426840 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.426848 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.428343 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" event={"ID":"35f5c146-0078-4ac0-9b34-b5ae446e4e35","Type":"ContainerStarted","Data":"f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.450571 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-
certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-re
sources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.461561 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.477879 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.496244 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.507547 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc 
kubenswrapper[4781]: I1208 20:05:18.520309 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.529016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.529067 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.529079 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.529099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.529109 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.536169 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.552351 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.569736 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.584402 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"n
ame\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.603569 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.617784 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.631656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.631693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.631705 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.631720 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.631735 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.632293 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.646734 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.663390 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.675227 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.693154 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:18Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.734127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.734162 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.734172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.734188 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.734201 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.836902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.836954 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.836963 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.836978 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.836989 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.936725 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:05:18 crc kubenswrapper[4781]: E1208 20:05:18.937093 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:05:34.937058234 +0000 UTC m=+51.088341641 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.939209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.939248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.939266 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:18 crc kubenswrapper[4781]: I1208 20:05:18.939289 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:18 crc kubenswrapper[4781]: 
I1208 20:05:18.939311 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:18Z","lastTransitionTime":"2025-12-08T20:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.038184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.038247 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.038276 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.038303 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038378 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038413 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038436 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038446 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038458 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038467 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038472 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038484 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:35.038459707 +0000 UTC m=+51.189743114 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038513 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:35.038497678 +0000 UTC m=+51.189781075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038519 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038539 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:35.038526508 +0000 UTC m=+51.189809995 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.038648 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:35.038621901 +0000 UTC m=+51.189905328 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.042393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.042438 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.042450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.042468 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.042481 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.125777 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.125778 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.125997 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.126102 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.126178 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.126283 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.125809 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.126648 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.145404 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.145459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.145475 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.145496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.145512 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.248808 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.248884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.248901 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.248961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.248983 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.352113 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.352471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.352614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.352764 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.352903 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.455956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.455990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.456001 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.456016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.456028 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.559055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.559118 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.559135 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.559160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.559177 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.644578 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.644770 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: E1208 20:05:19.644894 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs podName:c74e396c-5b68-47be-b86b-9f48c02ec760 nodeName:}" failed. No retries permitted until 2025-12-08 20:05:35.644863763 +0000 UTC m=+51.796147170 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs") pod "network-metrics-daemon-gr5xw" (UID: "c74e396c-5b68-47be-b86b-9f48c02ec760") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.662412 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.662476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.662495 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.662565 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.662584 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.764731 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.764967 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.765104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.765183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.765260 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.869579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.869660 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.869686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.869717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.869739 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.973761 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.974199 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.974224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.974242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:19 crc kubenswrapper[4781]: I1208 20:05:19.974254 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:19Z","lastTransitionTime":"2025-12-08T20:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.077318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.077382 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.077410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.077438 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.077461 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.180596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.180655 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.180674 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.180701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.180725 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.283231 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.283316 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.283340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.283372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.283398 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.386418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.386486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.386509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.386538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.386561 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.489497 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.489556 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.489578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.489607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.489631 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.592563 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.592609 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.592621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.592639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.592651 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.694796 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.695144 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.695229 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.695307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.695388 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.798324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.798364 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.798375 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.798391 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.798403 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.900898 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.901267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.901461 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.901607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:20 crc kubenswrapper[4781]: I1208 20:05:20.901739 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:20Z","lastTransitionTime":"2025-12-08T20:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.004200 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.004244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.004256 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.004272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.004286 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.106766 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.106814 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.106827 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.106845 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.106858 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.125457 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.125502 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:21 crc kubenswrapper[4781]: E1208 20:05:21.125567 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.125663 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.125705 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:21 crc kubenswrapper[4781]: E1208 20:05:21.125815 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:21 crc kubenswrapper[4781]: E1208 20:05:21.125954 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:21 crc kubenswrapper[4781]: E1208 20:05:21.125990 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.209203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.209244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.209257 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.209275 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.209289 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.311572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.311637 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.311656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.311696 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.311714 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.414804 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.414835 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.414863 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.414877 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.414886 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.516670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.516703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.516717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.516757 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.516769 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.619233 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.619283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.619295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.619312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.619326 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.721246 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.721292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.721339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.721360 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.721373 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.824067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.824145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.824171 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.824199 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.824219 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.926472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.926534 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.926550 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.926570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:21 crc kubenswrapper[4781]: I1208 20:05:21.926585 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:21Z","lastTransitionTime":"2025-12-08T20:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.029291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.029320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.029329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.029343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.029352 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.130852 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.130956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.130970 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.130986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.130997 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.233554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.233601 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.233613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.233631 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.233644 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.336548 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.336585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.336594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.336607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.336615 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.439976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.440386 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.441066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.441151 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.441167 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.545267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.546341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.546564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.546774 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.546999 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.650164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.650228 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.650249 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.650281 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.650303 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.752756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.752837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.752862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.752888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.752907 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.856372 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.856443 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.856482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.856514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.856536 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.960263 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.960316 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.960329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.960343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:22 crc kubenswrapper[4781]: I1208 20:05:22.960353 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:22Z","lastTransitionTime":"2025-12-08T20:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.063035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.063112 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.063134 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.063159 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.063180 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.125414 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.125464 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.125487 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:23 crc kubenswrapper[4781]: E1208 20:05:23.125635 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.125449 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:23 crc kubenswrapper[4781]: E1208 20:05:23.125815 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:23 crc kubenswrapper[4781]: E1208 20:05:23.125963 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:23 crc kubenswrapper[4781]: E1208 20:05:23.126030 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.166368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.166683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.166846 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.167006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.167155 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.270204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.270258 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.270274 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.270298 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.270319 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.373271 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.373324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.373336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.373354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.373369 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.475207 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.475239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.475249 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.475266 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.475277 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.577631 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.577681 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.577692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.577708 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.577719 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.680282 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.680314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.680322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.680335 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.680344 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.782047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.782137 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.782145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.782158 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.782188 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.884979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.885230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.885288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.885344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.885395 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.987838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.988181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.988389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.988577 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:23 crc kubenswrapper[4781]: I1208 20:05:23.988728 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:23Z","lastTransitionTime":"2025-12-08T20:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.091050 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.091093 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.091106 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.091122 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.091134 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.145377 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.158024 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.181999 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 
20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z
\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.192943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.193141 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.193226 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.193304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.193373 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.196267 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.211127 4781 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.224151 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.236062 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\
"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.262732 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.274210 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.286663 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.295245 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.295275 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.295283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.295295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.295304 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.307285 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.319932 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.334111 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.353762 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.367699 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.379320 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.389542 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:24Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.397615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.397644 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.397653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.397667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.397678 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.499225 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.499265 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.499277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.499292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.499302 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.601366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.602033 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.602071 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.602092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.602109 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.704047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.704088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.704099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.704117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.704127 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.806541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.806578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.806588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.806623 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.806634 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.909008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.909067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.909084 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.909107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:24 crc kubenswrapper[4781]: I1208 20:05:24.909126 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:24Z","lastTransitionTime":"2025-12-08T20:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.012157 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.012232 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.012258 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.012285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.012318 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.115119 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.115147 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.115155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.115167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.115175 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.125201 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:25 crc kubenswrapper[4781]: E1208 20:05:25.125326 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.125521 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.125609 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:25 crc kubenswrapper[4781]: E1208 20:05:25.125784 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.125611 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:25 crc kubenswrapper[4781]: E1208 20:05:25.125861 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:25 crc kubenswrapper[4781]: E1208 20:05:25.125995 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.217617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.217847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.217966 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.218061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.218145 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.321142 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.321620 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.321733 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.321821 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.321907 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.424434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.424694 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.424757 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.424840 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.424947 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.526613 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.526648 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.526657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.526673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.526684 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.629103 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.629682 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.629759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.629822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.629886 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.732186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.732261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.732273 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.732290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.732303 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.835357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.835422 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.835446 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.835475 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.835497 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.937988 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.938029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.938039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.938053 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:25 crc kubenswrapper[4781]: I1208 20:05:25.938061 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:25Z","lastTransitionTime":"2025-12-08T20:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.040862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.040934 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.040950 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.040972 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.040985 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.143514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.143772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.143837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.143910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.144004 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.246250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.246489 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.246637 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.246792 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.246967 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.349219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.349502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.349595 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.349666 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.349734 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.451755 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.452054 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.452149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.452263 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.452349 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.555311 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.555371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.555383 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.555400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.555412 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.657770 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.657814 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.657825 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.657844 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.657856 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.760776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.760825 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.760836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.760855 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.760868 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.863453 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.863734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.863802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.863870 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.863935 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.967056 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.967100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.967110 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.967130 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:26 crc kubenswrapper[4781]: I1208 20:05:26.967140 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:26Z","lastTransitionTime":"2025-12-08T20:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.072283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.072333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.072346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.072365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.072380 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.125137 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.125229 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.125298 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.125429 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.125570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.125667 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.125727 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.125787 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.175702 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.175756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.175769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.175788 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.175803 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.278410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.278458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.278473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.278492 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.278504 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.381117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.381160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.381172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.381188 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.381199 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.483561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.483667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.483681 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.483698 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.483710 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.586405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.586450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.586462 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.586483 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.586495 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.689061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.689110 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.689147 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.689184 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.689203 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.746222 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.746268 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.746276 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.746291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.746300 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.757435 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:27Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.760669 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.761100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.761117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.761130 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.761139 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.775434 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:27Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.779318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.779432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.779542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.779619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.779690 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.790739 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:27Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.794235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.794270 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.794278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.794292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.794305 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.808880 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:27Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.812469 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.812495 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.812505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.812519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.812529 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.827146 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:27Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:27 crc kubenswrapper[4781]: E1208 20:05:27.827330 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.829034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.829086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.829104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.829126 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.829142 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.932234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.932698 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.932976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.933258 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:27 crc kubenswrapper[4781]: I1208 20:05:27.933492 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:27Z","lastTransitionTime":"2025-12-08T20:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.036035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.036304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.036371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.036430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.036484 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.138550 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.138587 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.138599 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.138615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.138627 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.241868 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.241985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.242008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.242039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.242056 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.344879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.344940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.344952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.344967 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.344976 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.447456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.447504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.447517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.447535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.447548 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.550083 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.550165 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.550187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.550218 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.550239 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.652410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.652443 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.652451 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.652464 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.652471 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.756719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.756790 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.756813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.756842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.756864 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.859074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.859113 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.859124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.859138 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.859152 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.962036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.962074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.962085 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.962098 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:28 crc kubenswrapper[4781]: I1208 20:05:28.962106 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:28Z","lastTransitionTime":"2025-12-08T20:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.064878 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.064962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.064974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.064989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.065001 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.124835 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.124889 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.124958 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.124857 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:29 crc kubenswrapper[4781]: E1208 20:05:29.125014 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:29 crc kubenswrapper[4781]: E1208 20:05:29.125148 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:29 crc kubenswrapper[4781]: E1208 20:05:29.125267 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:29 crc kubenswrapper[4781]: E1208 20:05:29.125472 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.167734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.167772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.167783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.167798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.167808 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.269872 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.269940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.269953 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.269971 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.269984 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.372056 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.372087 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.372097 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.372111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.372122 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.474341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.474403 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.474426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.474457 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.474479 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.576992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.577123 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.577145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.577174 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.577193 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.680507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.680557 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.680573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.680593 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.680607 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.782414 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.782460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.782472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.782494 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.782507 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.884588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.884653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.884668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.884686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.884697 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.986658 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.986705 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.986720 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.986740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:29 crc kubenswrapper[4781]: I1208 20:05:29.986757 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:29Z","lastTransitionTime":"2025-12-08T20:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.090547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.090712 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.090738 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.090808 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.090838 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.125836 4781 scope.go:117] "RemoveContainer" containerID="1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.193445 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.193507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.193520 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.193540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.193554 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.296000 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.296035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.296059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.296076 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.296085 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.400544 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.400592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.400602 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.400627 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.400637 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.466524 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/1.log" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.469054 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.472674 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.495404 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c
49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.502267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.502306 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.502316 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.502331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.502341 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.515460 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.541076 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.554180 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.563734 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"prox
y-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.572852 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\
\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.585942 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.601123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.603942 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.603985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.603998 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.604013 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.604024 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.612765 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86
395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.628811 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.641062 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.655177 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.671746 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.687434 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc 
kubenswrapper[4781]: I1208 20:05:30.698997 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.706318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.706366 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.706378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.706396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.706407 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.718376 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.730839 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:30Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.809133 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.809438 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.809447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:30 crc 
kubenswrapper[4781]: I1208 20:05:30.809460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.809471 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.911606 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.911665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.911674 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.911691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:30 crc kubenswrapper[4781]: I1208 20:05:30.911701 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:30Z","lastTransitionTime":"2025-12-08T20:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.014759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.014808 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.014825 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.014847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.014859 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.116885 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.116936 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.116948 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.116982 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.116992 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.124992 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.125036 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.125036 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.125050 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:31 crc kubenswrapper[4781]: E1208 20:05:31.125144 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:31 crc kubenswrapper[4781]: E1208 20:05:31.125213 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:31 crc kubenswrapper[4781]: E1208 20:05:31.125295 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:31 crc kubenswrapper[4781]: E1208 20:05:31.125506 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.128387 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.139906 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.142837 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.154492 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.166206 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.181079 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"n
ame\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.194239 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.207346 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.219504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.219532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.219543 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.219564 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.219577 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.220003 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.230285 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.241770 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.253554 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.273275 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.293666 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.304808 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.317710 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.321088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.321116 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.321124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.321164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.321175 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.332111 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.342736 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc 
kubenswrapper[4781]: I1208 20:05:31.354470 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.423661 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.423695 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.423704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.423717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.423725 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.474808 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/2.log" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.475465 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/1.log" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.477867 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de" exitCode=1 Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.478903 4781 scope.go:117] "RemoveContainer" containerID="55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de" Dec 08 20:05:31 crc kubenswrapper[4781]: E1208 20:05:31.479031 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.479068 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.479105 4781 scope.go:117] "RemoveContainer" containerID="1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.494191 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.507596 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.519568 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.525763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.525807 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.525818 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.525834 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.525846 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.533811 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.544548 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.553375 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.565077 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.575117 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.584725 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.600346 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1892c11b8946f25d596c8faffccdedf7d30d9b2fe89dd98a545c6cb1e2131406\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:14Z\\\",\\\"message\\\":\\\"ddr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, 
Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1208 20:05:14.126594 6193 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:14.126996 6193 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:14Z is after 2025-08-24T17:21:41Z]\\\\nI1208 20:05:14.126991 6193 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1208 20:05:14.127017 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.611350 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.621221 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e984
0c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.628104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.628142 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.628156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.628172 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.628190 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.634691 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"cont
ainerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.644359 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc 
kubenswrapper[4781]: I1208 20:05:31.654334 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.671690 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.683316 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.697589 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:31Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.730654 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.730696 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.730710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.730724 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.730736 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.836662 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.836707 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.836743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.836757 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.836768 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.939731 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.939768 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.939781 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.939795 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:31 crc kubenswrapper[4781]: I1208 20:05:31.939806 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:31Z","lastTransitionTime":"2025-12-08T20:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.044156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.044224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.044244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.044269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.044296 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.146482 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.146524 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.146533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.146547 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.146555 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.250187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.250448 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.250509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.250571 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.250640 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.352638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.352681 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.352693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.352710 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.352722 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.455113 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.455154 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.455162 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.455182 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.455192 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.482954 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/2.log" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.486313 4781 scope.go:117] "RemoveContainer" containerID="55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de" Dec 08 20:05:32 crc kubenswrapper[4781]: E1208 20:05:32.486463 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.500742 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.514910 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.528310 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.541153 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.550696 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.557980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.558015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.558042 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.558057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.558069 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.564017 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated 
for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.576053 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.588882 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e984
0c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.611300 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.628827 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.641675 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.660930 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.661163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.661227 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.661288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.661348 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.663874 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.679355 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc 
kubenswrapper[4781]: I1208 20:05:32.690874 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.710876 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.724662 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.735767 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a
45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.750864 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:32Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.763829 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.763882 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.763898 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.763942 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.763954 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.866820 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.867421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.867498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.867567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.867628 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.970183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.970401 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.970765 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.970843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:32 crc kubenswrapper[4781]: I1208 20:05:32.970943 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:32Z","lastTransitionTime":"2025-12-08T20:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.073615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.073673 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.073691 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.073714 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.073729 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.125475 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:33 crc kubenswrapper[4781]: E1208 20:05:33.125744 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.125618 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:33 crc kubenswrapper[4781]: E1208 20:05:33.126010 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.125566 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:33 crc kubenswrapper[4781]: E1208 20:05:33.126235 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.125650 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:33 crc kubenswrapper[4781]: E1208 20:05:33.126607 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.175847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.175882 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.175899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.175941 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.175953 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.283179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.283225 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.283237 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.283253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.283266 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.385833 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.386065 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.386133 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.386223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.386293 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.489031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.489093 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.489114 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.489146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.489166 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.591821 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.591861 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.591872 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.591886 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.591895 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.694629 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.694704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.694730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.694755 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.694778 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.797745 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.797782 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.797798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.797816 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.797831 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.899821 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.899880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.899896 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.899945 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:33 crc kubenswrapper[4781]: I1208 20:05:33.899963 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:33Z","lastTransitionTime":"2025-12-08T20:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.002253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.002524 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.002703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.002864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.003018 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.105120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.105163 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.105175 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.105192 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.105204 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.137078 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.149080 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.168459 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.188384 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.200799 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.207668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.207822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.207990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc 
kubenswrapper[4781]: I1208 20:05:34.208098 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.208178 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.214381 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.236826 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.248811 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc 
kubenswrapper[4781]: I1208 20:05:34.264600 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.277514 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.290077 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.300355 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.310955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.311007 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.311023 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.311047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.311062 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.315982 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated 
for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.329750 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.341583 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.356845 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.370521 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.381012 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:34Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.415023 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.415064 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.415074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.415102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.415112 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.517444 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.517499 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.517514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.517535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.517550 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.620245 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.620285 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.620301 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.620319 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.620334 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.722856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.722985 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.723027 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.723059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.723082 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.825532 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.825567 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.825578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.825593 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.825604 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.928055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.928083 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.928092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.928105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.928113 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:34Z","lastTransitionTime":"2025-12-08T20:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:34 crc kubenswrapper[4781]: I1208 20:05:34.996132 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 20:05:34 crc kubenswrapper[4781]: E1208 20:05:34.996379 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:06:06.996361749 +0000 UTC m=+83.147645126 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.030551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.030603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.030628 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.030651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.030665 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.097987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.098062 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.098097 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098158 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.098170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098181 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098195 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098203 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098247 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 20:06:07.098227924 +0000 UTC m=+83.249511301 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098292 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098311 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:06:07.098287405 +0000 UTC m=+83.249570792 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098364 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:06:07.098337707 +0000 UTC m=+83.249621144 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098388 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098432 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098447 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.098510 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 20:06:07.098491061 +0000 UTC m=+83.249774518 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.124857 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.125042 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.124877 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.124869 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.125133 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.124893 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.125290 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.125464 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.132481 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.132577 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.132596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.132622 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.132641 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.235502 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.235540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.235553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.235571 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.235584 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.337938 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.337983 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.337995 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.338012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.338023 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.440148 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.440201 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.440217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.440239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.440256 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.542779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.542830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.542842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.542858 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.542871 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.645323 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.645358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.645368 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.645382 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.645392 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.705808 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw"
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.705955 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: E1208 20:05:35.706008 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs podName:c74e396c-5b68-47be-b86b-9f48c02ec760 nodeName:}" failed. No retries permitted until 2025-12-08 20:06:07.705995456 +0000 UTC m=+83.857278833 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs") pod "network-metrics-daemon-gr5xw" (UID: "c74e396c-5b68-47be-b86b-9f48c02ec760") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.747611 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.747670 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.747684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.747701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.747711 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.850329 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.850388 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.850405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.850426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.850440 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.952646 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.952686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.952695 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.952711 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:35 crc kubenswrapper[4781]: I1208 20:05:35.952724 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:35Z","lastTransitionTime":"2025-12-08T20:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.054992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.055052 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.055071 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.055096 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.055113 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.157732 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.157776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.157785 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.157801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.157810 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.261540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.261617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.261636 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.261663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.261680 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.365005 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.365070 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.365087 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.365110 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.365127 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.467988 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.468029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.468041 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.468059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.468072 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.570737 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.570812 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.570831 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.570855 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.570873 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.673983 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.674055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.674078 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.674108 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.674132 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.777408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.777453 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.777465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.777485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.777498 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.880401 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.880463 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.880479 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.880497 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.880510 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.983120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.983165 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.983175 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.983190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:36 crc kubenswrapper[4781]: I1208 20:05:36.983212 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:36Z","lastTransitionTime":"2025-12-08T20:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.086541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.086595 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.086630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.086653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.086668 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.125572 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.125590 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw"
Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.126388 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.125672 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.125648 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.126541 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.126675 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.126850 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.189207 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.189308 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.189334 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.189366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.189391 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.292312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.292359 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.292370 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.292387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.292402 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.396131 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.396173 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.396181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.396197 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.396206 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.499430 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.499484 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.499509 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.499539 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.499559 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.603025 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.603131 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.603155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.603190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.603213 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.706542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.706617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.706641 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.706672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.706696 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.810042 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.810125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.810152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.810181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.810203 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.845637 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.845681 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.845690 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.845703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.845711 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.866480 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:37Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.871182 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.871234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.871244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.871262 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.871276 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.890539 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:37Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.895437 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.895490 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.895507 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.895530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.895545 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.909692 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:37Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.913269 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.913297 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.913308 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.913322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.913333 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.924563 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:37Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.928252 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.928283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.928291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.928303 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.928312 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.938849 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:37Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:37 crc kubenswrapper[4781]: E1208 20:05:37.939030 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.940697 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.940746 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.940758 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.940775 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:37 crc kubenswrapper[4781]: I1208 20:05:37.940801 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:37Z","lastTransitionTime":"2025-12-08T20:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.043966 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.044042 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.044065 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.044093 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.044115 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.145906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.145956 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.145964 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.145975 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.145985 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.248067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.248108 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.248118 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.248134 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.248144 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.351055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.351090 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.351099 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.351113 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.351122 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.453424 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.453461 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.453472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.453487 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.453497 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.555331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.555376 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.555387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.555406 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.555417 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.658569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.658629 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.658647 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.658671 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.658690 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.761290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.761323 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.761331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.761344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.761353 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.863462 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.863510 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.863521 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.863541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.863567 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.965687 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.965731 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.965745 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.965762 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:38 crc kubenswrapper[4781]: I1208 20:05:38.965773 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:38Z","lastTransitionTime":"2025-12-08T20:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.068464 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.068541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.068561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.068587 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.068610 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.125236 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.125310 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:39 crc kubenswrapper[4781]: E1208 20:05:39.125400 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.125241 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.125319 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:39 crc kubenswrapper[4781]: E1208 20:05:39.125543 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:39 crc kubenswrapper[4781]: E1208 20:05:39.125891 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:39 crc kubenswrapper[4781]: E1208 20:05:39.125935 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.171063 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.171146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.171170 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.171207 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.171225 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.273893 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.273976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.273992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.274012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.274027 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.377026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.377107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.377125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.377150 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.377169 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.479108 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.479146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.479157 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.479172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.479181 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.582237 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.582291 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.582306 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.582327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.582341 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.685057 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.685153 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.685167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.685184 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.685195 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.788354 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.788392 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.788402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.788417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.788430 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.891152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.891215 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.891232 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.891253 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.891264 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.993791 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.993826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.993834 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.993848 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:39 crc kubenswrapper[4781]: I1208 20:05:39.993856 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:39Z","lastTransitionTime":"2025-12-08T20:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.097542 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.097616 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.097630 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.097651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.097663 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.200809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.200855 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.200880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.200901 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.200938 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.303971 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.304027 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.304047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.304072 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.304094 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.407011 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.407059 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.407082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.407105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.407134 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.510164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.510210 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.510220 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.510233 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.510243 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.613838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.613873 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.613882 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.613896 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.613907 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.717063 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.717134 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.717152 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.717179 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.717196 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.820693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.820768 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.820825 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.820854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.820871 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.923749 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.923811 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.923822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.923842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:40 crc kubenswrapper[4781]: I1208 20:05:40.923852 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:40Z","lastTransitionTime":"2025-12-08T20:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.026589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.026653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.026672 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.026699 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.026717 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.124849 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:41 crc kubenswrapper[4781]: E1208 20:05:41.125086 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.125735 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.125913 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.125981 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:41 crc kubenswrapper[4781]: E1208 20:05:41.126263 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:41 crc kubenswrapper[4781]: E1208 20:05:41.126335 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:41 crc kubenswrapper[4781]: E1208 20:05:41.126507 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.129993 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.130034 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.130050 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.130071 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.130087 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.233832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.233903 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.233964 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.233988 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.234003 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.337104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.337151 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.337165 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.337210 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.337220 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.439379 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.439499 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.439541 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.439589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.439602 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.542696 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.542773 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.542788 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.542810 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.542825 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.645862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.645976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.646007 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.646040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.646062 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.749352 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.749425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.749441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.749469 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.749486 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.852560 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.852623 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.852635 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.852658 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.852672 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.954986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.955035 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.955048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.955066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:41 crc kubenswrapper[4781]: I1208 20:05:41.955079 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:41Z","lastTransitionTime":"2025-12-08T20:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.057511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.057561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.057572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.057590 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.057602 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.160540 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.160612 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.160637 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.160668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.160690 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.263629 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.263700 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.263713 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.263762 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.263774 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.365603 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.365657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.365667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.365688 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.365713 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.468822 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.468889 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.468904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.468961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.468976 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.571506 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.571555 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.571568 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.571588 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.571601 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.674387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.674643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.674759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.674848 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.674937 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.777488 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.777538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.777552 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.777570 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.777583 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.880392 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.880454 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.880466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.880485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.880497 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.983418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.983476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.983494 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.983517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:42 crc kubenswrapper[4781]: I1208 20:05:42.983534 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:42Z","lastTransitionTime":"2025-12-08T20:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.087108 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.087158 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.087168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.087190 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.087200 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.125017 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.125079 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.125051 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.125017 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:43 crc kubenswrapper[4781]: E1208 20:05:43.125234 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:43 crc kubenswrapper[4781]: E1208 20:05:43.125326 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:43 crc kubenswrapper[4781]: E1208 20:05:43.125460 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:43 crc kubenswrapper[4781]: E1208 20:05:43.125540 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.190554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.190632 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.190640 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.190656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.190667 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.293122 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.293523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.293725 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.293965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.294133 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.397234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.397278 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.397295 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.397318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.397335 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.499981 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.500023 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.500033 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.500047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.500057 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.602698 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.602747 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.602759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.602776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.602788 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.706076 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.706140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.706161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.706187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.706204 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.808562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.808620 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.808632 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.808650 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.808664 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.911312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.911380 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.911400 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.911425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:43 crc kubenswrapper[4781]: I1208 20:05:43.911444 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:43Z","lastTransitionTime":"2025-12-08T20:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.013991 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.014048 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.014065 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.014088 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.014104 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.117161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.117207 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.117225 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.117244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.117259 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.125903 4781 scope.go:117] "RemoveContainer" containerID="55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de" Dec 08 20:05:44 crc kubenswrapper[4781]: E1208 20:05:44.126221 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.142110 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.156619 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.168991 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.182569 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.197470 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.213155 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\"
,\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 
20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated
\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.220123 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.220168 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.220184 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.220203 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.220217 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.228442 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.239892 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.252764 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.265599 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.279138 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.297548 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.308562 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.319196 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.323460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.323492 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.323500 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.323514 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.323522 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.337967 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.349179 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.372310 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.401623 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:44Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.425857 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.425895 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.425905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.425938 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.425947 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.528223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.528265 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.528280 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.528303 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.528320 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.631699 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.631749 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.631765 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.631789 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.631804 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.734243 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.734288 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.734299 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.734335 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.734347 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.837939 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.837987 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.837999 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.838015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.838027 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.940284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.940346 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.940356 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.940371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:44 crc kubenswrapper[4781]: I1208 20:05:44.940380 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:44Z","lastTransitionTime":"2025-12-08T20:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.043878 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.043912 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.043944 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.043990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.044003 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.125627 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.125635 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.125718 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.125722 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:45 crc kubenswrapper[4781]: E1208 20:05:45.125848 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:45 crc kubenswrapper[4781]: E1208 20:05:45.125970 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:45 crc kubenswrapper[4781]: E1208 20:05:45.126057 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:45 crc kubenswrapper[4781]: E1208 20:05:45.126113 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.146286 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.146318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.146327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.146342 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.146353 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.248763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.248838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.248858 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.248883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.248901 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.351986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.352067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.352114 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.352148 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.352225 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.454853 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.454892 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.454904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.454943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.454957 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.558805 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.558873 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.558888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.558907 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.558940 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.661880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.661971 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.661990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.662014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.662031 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.764025 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.764054 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.764062 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.764074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.764082 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.865959 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.865987 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.865994 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.866006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.866015 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.968724 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.968769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.968779 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.968795 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:45 crc kubenswrapper[4781]: I1208 20:05:45.968806 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:45Z","lastTransitionTime":"2025-12-08T20:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.072146 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.072225 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.072248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.072317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.072340 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.174446 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.174479 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.174489 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.174504 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.174514 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.277784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.277842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.277866 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.277893 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.277953 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.380838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.380911 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.381007 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.381031 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.381048 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.483512 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.483550 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.483562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.483577 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.483589 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.585862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.585900 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.585910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.585945 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.585954 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.688058 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.688183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.688219 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.688248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.688272 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.791610 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.791680 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.791699 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.791725 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.791743 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.894515 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.894549 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.894557 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.894571 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.894582 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.996729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.996773 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.996784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.996801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:46 crc kubenswrapper[4781]: I1208 20:05:46.996813 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:46Z","lastTransitionTime":"2025-12-08T20:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.099166 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.099212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.099223 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.099239 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.099251 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.125792 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:47 crc kubenswrapper[4781]: E1208 20:05:47.125976 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.126164 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.126300 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.126250 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:47 crc kubenswrapper[4781]: E1208 20:05:47.126536 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:47 crc kubenswrapper[4781]: E1208 20:05:47.126651 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:47 crc kubenswrapper[4781]: E1208 20:05:47.126844 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.201769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.201809 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.201821 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.201839 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.201852 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.304936 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.305577 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.305693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.305796 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.305914 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.408336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.408392 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.408406 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.408427 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.408444 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.510323 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.510366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.510378 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.510395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.510407 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.613007 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.613061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.613074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.613113 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.613127 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.715217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.715254 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.715264 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.715277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.715286 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.817624 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.817677 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.817692 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.817712 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.817726 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.921181 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.921244 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.921267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.921293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:47 crc kubenswrapper[4781]: I1208 20:05:47.921315 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:47Z","lastTransitionTime":"2025-12-08T20:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.023857 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.023899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.023909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.023957 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.023970 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.126650 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.126688 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.126701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.126718 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.126730 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.203798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.203871 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.203883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.203896 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.203906 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: E1208 20:05:48.219056 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:48Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.222635 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.222667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.222677 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.222693 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.222703 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: E1208 20:05:48.235050 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:48Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.238889 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.238930 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.238942 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.238957 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.238969 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: E1208 20:05:48.249981 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:48Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.253284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.253312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.253322 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.253335 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.253346 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: E1208 20:05:48.266406 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:48Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.270194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.270220 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.270230 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.270246 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.270257 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: E1208 20:05:48.280882 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:48Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:48 crc kubenswrapper[4781]: E1208 20:05:48.281048 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.282904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.282939 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.282947 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.282960 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.282969 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.385351 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.385390 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.385403 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.385419 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.385430 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.486973 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.487012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.487020 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.487033 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.487042 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.589746 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.589784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.589795 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.589813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.589826 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.692040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.692083 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.692094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.692292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.692309 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.794473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.794511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.794519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.794537 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.794549 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.896607 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.896644 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.896663 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.896679 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.896689 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.999189 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.999237 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.999247 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.999261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:48 crc kubenswrapper[4781]: I1208 20:05:48.999270 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:48Z","lastTransitionTime":"2025-12-08T20:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.101976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.102014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.102022 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.102036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.102047 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.125786 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:49 crc kubenswrapper[4781]: E1208 20:05:49.125890 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.126043 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:49 crc kubenswrapper[4781]: E1208 20:05:49.126087 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.126181 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:49 crc kubenswrapper[4781]: E1208 20:05:49.126224 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.126321 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:49 crc kubenswrapper[4781]: E1208 20:05:49.126365 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.204170 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.204204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.204214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.204227 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.204237 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.306569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.306638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.306651 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.306668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.306680 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.409403 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.409480 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.409503 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.409530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.409551 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.511388 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.511417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.511425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.511437 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.511446 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.614055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.614115 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.614126 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.614145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.614157 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.716333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.716380 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.716390 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.716402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.716410 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.818512 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.818556 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.818569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.818585 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.818597 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.921006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.921042 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.921056 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.921073 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:49 crc kubenswrapper[4781]: I1208 20:05:49.921085 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:49Z","lastTransitionTime":"2025-12-08T20:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.023975 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.024014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.024023 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.024039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.024047 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.126318 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.126371 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.126387 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.126409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.126425 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.229248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.229281 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.229290 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.229305 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.229315 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.333117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.333196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.333207 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.333228 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.333239 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.435160 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.435187 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.435196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.435208 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.435216 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.537838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.537880 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.537889 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.537903 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.537913 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.547147 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/0.log" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.547200 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20" containerID="4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8" exitCode=1 Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.547236 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tm5z7" event={"ID":"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20","Type":"ContainerDied","Data":"4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.547726 4781 scope.go:117] "RemoveContainer" containerID="4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.560208 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.572235 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"2025-12-08T20:05:05+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69\\\\n2025-12-08T20:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69 to /host/opt/cni/bin/\\\\n2025-12-08T20:05:05Z [verbose] multus-daemon started\\\\n2025-12-08T20:05:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T20:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.582177 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.597641 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.610059 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.620502 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.632383 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.639946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.640023 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.640033 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.640047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.640056 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.642298 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.650736 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.661019 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.671790 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.687625 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.707459 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.719938 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.732650 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.742466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.742689 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.742820 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.742962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.743067 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.746002 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.755590 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc 
kubenswrapper[4781]: I1208 20:05:50.767455 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:50Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.845155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.845185 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.845194 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.845208 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.845217 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.947366 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.947403 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.947413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.947429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:50 crc kubenswrapper[4781]: I1208 20:05:50.947440 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:50Z","lastTransitionTime":"2025-12-08T20:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.049836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.049876 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.049887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.049905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.049934 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.124868 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.125015 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:51 crc kubenswrapper[4781]: E1208 20:05:51.125040 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.124879 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:51 crc kubenswrapper[4781]: E1208 20:05:51.125156 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.125016 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:51 crc kubenswrapper[4781]: E1208 20:05:51.125340 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:51 crc kubenswrapper[4781]: E1208 20:05:51.125433 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.151850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.152423 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.152528 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.152615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.152700 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.255050 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.255316 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.255398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.255465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.255526 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.361347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.361393 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.361402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.361419 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.361429 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.463936 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.464002 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.464018 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.464039 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.464056 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.555211 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/0.log" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.555263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tm5z7" event={"ID":"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20","Type":"ContainerStarted","Data":"0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.566008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.566040 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.566047 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.566060 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.566069 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.570347 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.582623 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.594807 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"2025-12-08T20:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69\\\\n2025-12-08T20:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69 to /host/opt/cni/bin/\\\\n2025-12-08T20:05:05Z [verbose] multus-daemon started\\\\n2025-12-08T20:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-08T20:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.606871 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.621026 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.632053 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"prox
y-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.642139 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\
\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.655513 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.666765 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.668518 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.668550 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.668559 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.668575 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.668586 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.681157 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86
395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.700302 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.712150 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.723426 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.737591 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.749246 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc 
kubenswrapper[4781]: I1208 20:05:51.759423 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.770856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.771062 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.771087 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.771101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.771111 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.776309 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.786574 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:51Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.873655 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.873694 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.873746 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc 
kubenswrapper[4781]: I1208 20:05:51.873760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.873769 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.975719 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.975976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.976055 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.976124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:51 crc kubenswrapper[4781]: I1208 20:05:51.976185 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:51Z","lastTransitionTime":"2025-12-08T20:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.078365 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.078414 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.078425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.078437 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.078446 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.180807 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.180850 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.180862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.180879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.180890 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.283341 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.283385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.283395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.283408 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.283420 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.385793 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.385840 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.385863 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.385890 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.385947 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.488943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.488980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.488991 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.489008 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.489019 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.591932 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.591984 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.592002 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.592024 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.592044 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.693676 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.693706 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.693717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.693730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.693741 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.795759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.795831 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.795854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.795884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.795909 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.898771 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.898842 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.898864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.898894 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:52 crc kubenswrapper[4781]: I1208 20:05:52.898951 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:52Z","lastTransitionTime":"2025-12-08T20:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.003743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.003833 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.003895 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.003948 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.003970 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.105948 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.106221 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.106312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.106411 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.106510 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.125231 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:53 crc kubenswrapper[4781]: E1208 20:05:53.125336 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.125400 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.125450 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.125242 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:53 crc kubenswrapper[4781]: E1208 20:05:53.125552 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:53 crc kubenswrapper[4781]: E1208 20:05:53.125586 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:53 crc kubenswrapper[4781]: E1208 20:05:53.125628 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.208582 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.208811 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.208908 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.209032 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.209123 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.310937 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.311208 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.311294 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.311369 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.311440 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.413196 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.413250 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.413263 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.413283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.413294 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.515426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.516021 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.516120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.516214 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.516295 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.618036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.618075 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.618086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.618100 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.618111 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.720145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.720183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.720193 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.720209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.720221 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.822445 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.822756 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.822883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.823012 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.823118 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.925591 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.925635 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.925647 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.925668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:53 crc kubenswrapper[4781]: I1208 20:05:53.925686 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:53Z","lastTransitionTime":"2025-12-08T20:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.027713 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.027760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.027770 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.027786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.027796 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.130118 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.130150 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.130161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.130197 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.130208 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.169255 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.180162 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.191324 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e984
0c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.206264 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.218103 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc 
kubenswrapper[4781]: I1208 20:05:54.230701 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.232094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.232134 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.232144 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.232159 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.232169 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.254981 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.269551 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.280881 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.291271 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.303411 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"2025-12-08T20:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69\\\\n2025-12-08T20:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69 to /host/opt/cni/bin/\\\\n2025-12-08T20:05:05Z [verbose] multus-daemon started\\\\n2025-12-08T20:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-08T20:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.312884 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d7
2ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.327465 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.334551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.334589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.334601 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.334615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.334626 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.339571 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.348437 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.367368 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 
20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z
\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.377866 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.388708 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:54Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.436343 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.436501 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.436538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.436569 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.436590 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.539128 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.539164 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.539172 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.539188 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.539197 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.641155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.641183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.641192 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.641204 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.641213 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.743498 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.743533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.743558 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.743572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.743583 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.845774 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.845815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.845826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.845840 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.845851 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.948631 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.948686 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.948696 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.948713 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:54 crc kubenswrapper[4781]: I1208 20:05:54.948724 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:54Z","lastTransitionTime":"2025-12-08T20:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.050878 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.050946 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.050959 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.050975 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.051007 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.124996 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.125134 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.125236 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:55 crc kubenswrapper[4781]: E1208 20:05:55.125231 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.125290 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:55 crc kubenswrapper[4781]: E1208 20:05:55.125365 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:55 crc kubenswrapper[4781]: E1208 20:05:55.125409 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:55 crc kubenswrapper[4781]: E1208 20:05:55.125477 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.153737 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.153786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.153798 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.153814 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.153826 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.256815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.256872 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.256884 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.256903 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.256938 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.359409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.359448 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.359456 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.359473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.359483 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.461103 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.461149 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.461161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.461183 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.461195 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.564604 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.564638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.564648 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.564662 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.564672 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.666248 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.666276 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.666286 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.666298 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.666305 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.768058 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.768111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.768126 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.768145 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.768158 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.870385 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.870432 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.870443 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.870459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.870471 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.972856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.972910 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.972960 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.972986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:55 crc kubenswrapper[4781]: I1208 20:05:55.973004 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:55Z","lastTransitionTime":"2025-12-08T20:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.076283 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.076317 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.076326 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.076338 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.076348 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.179127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.179178 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.179188 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.179206 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.179216 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.281523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.281568 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.281578 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.281593 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.281604 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.384497 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.384538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.384548 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.384561 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.384570 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.486442 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.486480 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.486488 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.486523 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.486533 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.588886 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.588951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.588960 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.588977 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.588987 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.691006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.691046 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.691058 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.691074 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.691084 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.793272 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.793311 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.793333 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.793348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.793357 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.895734 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.895800 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.895815 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.895841 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.895864 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.998350 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.998414 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.998424 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.998446 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:56 crc kubenswrapper[4781]: I1208 20:05:56.998460 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:56Z","lastTransitionTime":"2025-12-08T20:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.101028 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.101189 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.101209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.101237 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.101262 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.125492 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.125564 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:57 crc kubenswrapper[4781]: E1208 20:05:57.125599 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.125629 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:57 crc kubenswrapper[4781]: E1208 20:05:57.125673 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:57 crc kubenswrapper[4781]: E1208 20:05:57.125736 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.125774 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:57 crc kubenswrapper[4781]: E1208 20:05:57.125837 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.204014 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.204082 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.204104 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.204132 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.204156 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.306657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.306729 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.306747 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.306772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.306795 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.409153 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.409198 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.409209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.409226 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.409238 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.515833 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.515888 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.515904 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.515955 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.515974 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.618778 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.618827 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.618838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.618855 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.618866 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.721638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.721700 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.721712 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.721730 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.721742 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.824199 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.824257 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.824277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.824300 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.824325 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.926347 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.926384 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.926395 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.926410 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:57 crc kubenswrapper[4781]: I1208 20:05:57.926421 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:57Z","lastTransitionTime":"2025-12-08T20:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.028639 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.028683 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.028700 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.028717 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.028734 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.126016 4781 scope.go:117] "RemoveContainer" containerID="55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.149715 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.149755 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.149767 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.149782 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.149797 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.251969 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.252004 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.252015 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.252030 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.252039 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.354085 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.354113 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.354124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.354140 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.354152 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.460604 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.460659 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.460668 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.460684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.460694 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.563003 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.563053 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.563072 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.563101 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.563120 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.576379 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/2.log" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.579081 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.580372 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.591352 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a19a77a66bede932c0b113bf2257987ba53c174
93c3db9eee6b2869f005dd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"2025-12-08T20:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69\\\\n2025-12-08T20:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69 to /host/opt/cni/bin/\\\\n2025-12-08T20:05:05Z [verbose] multus-daemon started\\\\n2025-12-08T20:05:05Z [verbose] Readiness Indicator file check\\\\n2025-12-08T20:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.599289 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.611076 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.612125 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.612156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.612169 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.612186 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.612197 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: E1208 20:05:58.625530 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.628213 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.629702 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.629743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.629759 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.629776 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.630216 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.639511 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: E1208 20:05:58.646396 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.650535 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.650563 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.650571 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.650584 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.650593 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.658659 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: E1208 20:05:58.663119 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.668472 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.668496 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.668511 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.668526 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.668534 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.678351 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: E1208 20:05:58.680476 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.683586 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.683612 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.683622 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.683638 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.683649 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.692045 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: E1208 20:05:58.700759 4781 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d25e13ca-71aa-4676-b902-e9f68902b8c8\\\",\\\"systemUUID\\\":\\\"b452ca71-9514-4136-b425-cea2dc682adc\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: E1208 20:05:58.700984 4781 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.702786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.702824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.702833 4781 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.702847 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.702859 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.710042 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 
configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.720255 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.732902 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.750660 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.762603 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.772769 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.785390 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.794239 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc 
kubenswrapper[4781]: I1208 20:05:58.805066 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.805105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.805117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.805138 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.805150 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.806236 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.825225 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:58Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.911358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.911409 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.911423 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.911442 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:58 crc kubenswrapper[4781]: I1208 20:05:58.911454 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:58Z","lastTransitionTime":"2025-12-08T20:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.013425 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.013460 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.013471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.013486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.013498 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.116016 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.116056 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.116067 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.116081 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.116092 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.125496 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.125595 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:05:59 crc kubenswrapper[4781]: E1208 20:05:59.125636 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.125649 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.125767 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:05:59 crc kubenswrapper[4781]: E1208 20:05:59.125816 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:05:59 crc kubenswrapper[4781]: E1208 20:05:59.125883 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:05:59 crc kubenswrapper[4781]: E1208 20:05:59.125988 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.218167 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.218202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.218213 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.218227 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.218239 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.321755 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.321819 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.321899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.321976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.322000 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.424568 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.424625 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.424637 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.424650 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.424657 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.526741 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.526774 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.526784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.526796 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.526821 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.583450 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/3.log" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.584192 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/2.log" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.586481 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" exitCode=1 Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.586532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.586623 4781 scope.go:117] "RemoveContainer" containerID="55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.587680 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" Dec 08 20:05:59 crc kubenswrapper[4781]: E1208 20:05:59.596177 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.602677 4781 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.617145 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.629397 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.629441 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.629453 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.629469 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.629482 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.632431 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.649644 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6
de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.666771 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.681400 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.698993 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55fb44c2576add59b037c3c93affd51c4d0d77be8a1727c7ea00cee5edaab6de\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"message\\\":\\\" UUID: UUIDName:}]\\\\nI1208 20:05:30.929177 6416 model_client.go:398] Mutate operations 
generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1208 20:05:30.929222 6416 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1208 20:05:30.929248 6416 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:59Z\\\",\\\"message\\\":\\\"6814 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:59.117802 6814 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1208 20:05:59.117947 6814 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.118198 6814 factory.go:656] Stopping watch factory\\\\nI1208 20:05:59.118443 6814 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.119163 6814 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.119346 6814 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1208 20:05:59.119875 6814 ovnkube.go:599] Stopped ovnkube\\\\nI1208 20:05:59.119907 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:59.119976 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.709994 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.720350 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e984
0c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.731657 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.731689 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.731701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.731716 4781 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.731728 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.737717 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"cont
ainerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitC
ode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.747123 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc 
kubenswrapper[4781]: I1208 20:05:59.757244 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.773647 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.785198 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.798542 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.811179 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.823651 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"2025-12-08T20:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69\\\\n2025-12-08T20:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69 to /host/opt/cni/bin/\\\\n2025-12-08T20:05:05Z [verbose] multus-daemon started\\\\n2025-12-08T20:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-08T20:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.833505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.833538 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.833551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.833568 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.833583 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.833663 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d72ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:05:59Z is after 2025-08-24T17:21:41Z" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.936740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.937129 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.937202 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.937306 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:05:59 crc kubenswrapper[4781]: I1208 20:05:59.937376 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:05:59Z","lastTransitionTime":"2025-12-08T20:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.040065 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.040112 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.040124 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.040142 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.040154 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.142476 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.142533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.142550 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.142572 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.142591 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.244898 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.244965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.244977 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.244995 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.245007 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.347421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.347458 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.347466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.347480 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.347489 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.450492 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.450533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.450545 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.450589 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.450601 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.552704 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.552740 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.552752 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.552766 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.552777 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.590413 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/3.log" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.593077 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" Dec 08 20:06:00 crc kubenswrapper[4781]: E1208 20:06:00.593199 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.604430 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.617899 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e984
0c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.637721 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:59Z\\\",\\\"message\\\":\\\"6814 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:59.117802 6814 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from 
github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1208 20:05:59.117947 6814 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.118198 6814 factory.go:656] Stopping watch factory\\\\nI1208 20:05:59.118443 6814 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.119163 6814 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.119346 6814 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1208 20:05:59.119875 6814 ovnkube.go:599] Stopped ovnkube\\\\nI1208 20:05:59.119907 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:59.119976 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.654667 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.654766 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.654786 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.654812 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.654830 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.657712 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.669687 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.681000 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.694228 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e186
06bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.703675 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc 
kubenswrapper[4781]: I1208 20:06:00.714003 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.723841 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.735157 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"2025-12-08T20:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69\\\\n2025-12-08T20:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69 to /host/opt/cni/bin/\\\\n2025-12-08T20:05:05Z [verbose] multus-daemon started\\\\n2025-12-08T20:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-08T20:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.745733 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d7
2ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.757562 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.757616 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.757632 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.757652 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.757667 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.760544 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b8
9c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.772773 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.811719 4781 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.826825 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.836815 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.847350 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:00Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.860416 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.860665 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.860784 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.860909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.861052 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.963102 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.963392 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.963485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.963582 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:00 crc kubenswrapper[4781]: I1208 20:06:00.963671 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:00Z","lastTransitionTime":"2025-12-08T20:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.066751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.066862 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.066891 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.066995 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.067023 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.124984 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.125049 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:01 crc kubenswrapper[4781]: E1208 20:06:01.125109 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.125143 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.125213 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:01 crc kubenswrapper[4781]: E1208 20:06:01.125328 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:01 crc kubenswrapper[4781]: E1208 20:06:01.125403 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:01 crc kubenswrapper[4781]: E1208 20:06:01.125560 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.170271 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.170339 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.170358 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.170382 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.170403 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.272553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.272594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.272604 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.272620 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.272630 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.374782 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.374824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.374838 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.374855 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.374866 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.477330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.477389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.477405 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.477426 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.477458 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.580212 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.580246 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.580255 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.580267 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.580276 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.683017 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.683092 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.683111 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.683138 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.683156 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.785858 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.785953 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.785972 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.785996 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.786013 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.889254 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.889304 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.889325 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.889348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.889363 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.992466 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.992515 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.992527 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.992546 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:01 crc kubenswrapper[4781]: I1208 20:06:01.992561 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:01Z","lastTransitionTime":"2025-12-08T20:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.095524 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.095576 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.095592 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.095614 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.095719 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.198793 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.198843 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.198857 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.198877 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.198891 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.302344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.302389 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.302399 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.302416 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.302428 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.405887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.406261 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.406443 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.406615 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.406746 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.510235 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.510529 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.510643 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.510772 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.510896 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.613242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.613293 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.613309 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.613327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.613340 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.716246 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.716284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.716296 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.716312 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.716323 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.818707 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.818778 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.818797 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.818819 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.818836 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.920802 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.920861 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.920878 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.920905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:02 crc kubenswrapper[4781]: I1208 20:06:02.920951 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:02Z","lastTransitionTime":"2025-12-08T20:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.023854 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.023899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.023914 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.023952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.023967 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.125195 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.125298 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:03 crc kubenswrapper[4781]: E1208 20:06:03.125349 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.125209 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:03 crc kubenswrapper[4781]: E1208 20:06:03.125473 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.125312 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:03 crc kubenswrapper[4781]: E1208 20:06:03.125555 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:03 crc kubenswrapper[4781]: E1208 20:06:03.125608 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.126865 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.126941 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.126961 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.126986 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.127004 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.230262 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.230303 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.230314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.230330 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.230342 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.332992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.333242 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.333382 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.333461 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.333537 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.435468 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.435505 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.435518 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.435533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.435545 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.537861 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.537906 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.537940 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.537962 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.537989 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.653051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.653180 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.653209 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.653241 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.653266 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.755485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.755517 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.755533 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.755565 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.755575 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.859006 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.859292 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.859309 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.859324 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.859335 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.962335 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.962384 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.962396 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.962413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:03 crc kubenswrapper[4781]: I1208 20:06:03.962423 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:03Z","lastTransitionTime":"2025-12-08T20:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.065814 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.065856 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.065875 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.065893 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.065904 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.140472 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.154434 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tm5z7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:50Z\\\",\\\"message\\\":\\\"2025-12-08T20:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69\\\\n2025-12-08T20:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5ca749e5-aa1c-495e-b0ea-2a3cbc3cea69 to /host/opt/cni/bin/\\\\n2025-12-08T20:05:05Z [verbose] multus-daemon started\\\\n2025-12-08T20:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-08T20:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c8j4f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tm5z7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.164224 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jphfq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e290f4ba-e014-443f-bbaa-1eeb23a9bd15\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10bb97a6c2ab807d7
2ad5d37a4c5e8c49864bf650427b9548cd9f1f021142a6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9htfg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jphfq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.167424 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.167455 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.167465 4781 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.167480 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.167488 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.178646 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a7658859d2dccb91d97908328f146c5241fae0dedd5b96dc539eb54971f6aff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.190034 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81e53b97135622bab8caab0306f51da440e4414393493ca90bb581b58c0cddfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f78
0a1fd11bb3197981d45b3c01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2p5kn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kr4pr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.199549 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-568jn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a03f1d5-f7f0-4b2b-8ae6-e2e7dd397d93\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5e020494f6f511dcb5cea9d43fc6cae1ef8fefc69b6c9659ece22d7d1958c930\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qphws\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-568jn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.216095 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8304801-0d0d-4205-855d-777341abe76d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\":]:17697\\\\nI1208 20:05:03.128621 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1208 20:05:03.129311 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129361 1 shared_informer.go:313] Waiting for caches to sync for RequestHeaderAuthRequestController\\\\nI1208 20:05:03.129402 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3144300213/tls.crt::/tmp/serving-cert-3144300213/tls.key\\\\\\\"\\\\nI1208 20:05:03.129556 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1208 20:05:03.130125 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1208 20:05:03.130141 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1208 20:05:03.130368 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130378 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1208 
20:05:03.130387 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1208 20:05:03.130400 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1208 20:05:03.136289 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.136849 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nI1208 20:05:03.138141 1 reflector.go:368] Caches populated for *v1.ConfigMap from k8s.io/client-go@v0.31.1/tools/cache/reflector.go:243\\\\nF1208 20:05:03.138443 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z
\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.239104 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3422763d-e28a-4033-ab0e-8d63ab2a6dca\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8822d5f62f45c6bcb98e2e3f4a13054614022e013374197e6732ff4f4b75f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c3900b2224c874167f45572d90b935a87bda29f8400e67d6ea070fe7b54952e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c026c1b21bf5ef251b06841d5bccdba906911ebd01ffb8a80f9ab87ab6547e84\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.251468 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5807426-b332-4447-acae-956b3e59f0b5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://595a833e27a2ce019e5a2c06b12cf8c3f36fb8377adbfab0a3ecb22adbe3a2c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebffab9e6d8c675362193da6ec529e34f1fe6de539e6835f84e501b6f35138a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72af9e3ecbbd8f93747689fc6479f7c379fa06452e6fcb6f7403eca55afaaa2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://41e357dfb39f7bc9b5ab97af1b51bb693c20b8d8e5a6d10786e7ce47abc9e810\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.267432 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3526d83-eb7e-486e-9357-80df536d09fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-08T20:05:59Z\\\",\\\"message\\\":\\\"6814 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1208 20:05:59.117802 6814 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1208 20:05:59.117947 6814 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.118198 6814 factory.go:656] Stopping watch factory\\\\nI1208 20:05:59.118443 6814 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.119163 6814 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1208 20:05:59.119346 6814 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1208 20:05:59.119875 6814 ovnkube.go:599] Stopped ovnkube\\\\nI1208 20:05:59.119907 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1208 20:05:59.119976 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9149828d1e64a2c508
7698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9dbq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-67t9k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.269596 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.269751 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.269826 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.269909 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.270031 4781 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.282360 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17c9119b616104f153daf5d383c1ff355ab1369d3e50d1ae0ced9069641afc4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.294048 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0629155d9c272cae42f2102a3364863101486350e7541a6fb8a7709ea450280a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fdc6d86395e9840c1eb254cf482b9a774f7c41721d6fa962c72ffc21aa7dc8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc 
kubenswrapper[4781]: I1208 20:06:04.306771 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6a36d76-3939-4b05-a4b4-c97c3c03aaf9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da74bd2334d9f5e7c57d5f8be6e3be2c9be063c75ca7740a72b887873808367e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3630e04151b5f02575b0d27cdbf414ce8b6d2ad04f90e0d286b5248025eb42bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c439b4e19b1b4935b14105039d8906025c7f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c3e5c6ee76a6be86aefdd49f82c4
39b4e19b1b4935b14105039d8906025c7f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ef1de31de8b67ee0f983c8e01305dafaa8299b21df23cc512dc77fc30563151\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e18606bb79b83e2708f14bb00d769eb771e94bf4dc2e8acd45217171a510f09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://8e650ed090bcd7309f28880d9e531bc7d63117c963dc82e3dab4b2fdd8c366be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://924411c09fe4c428d9317e7741250b1600b9352b8fd5c2a944897d32419d40aa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:05:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nmtcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-zqc9l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.315619 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c74e396c-5b68-47be-b86b-9f48c02ec760\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmskv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:03Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-gr5xw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc 
kubenswrapper[4781]: I1208 20:06:04.326229 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35f5c146-0078-4ac0-9b34-b5ae446e4e35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://054fb26a82b2be1189e0f59942714635f480e3ad8c270109eea561c06e5ffc24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f06e07e48ca576695e267bede8e99ce49f1c6e4d16409848c3176c2f8a3d8743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:05:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vljx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-vbcgq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.353505 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1fe9af4-48d7-41f2-bac1-8ea74516b939\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-08T20:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62f579050fea6aa095f3de4bdc614f75cdf50a062604c93eda00e4ac7a26c9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d3510a8b3b776881385cc78b4cffc988243f3af6afcfc9b928d85ce6d2450d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cd4a7165ef29c47b5588f5a6d556655743e33840dcbfc79429e7c8aa027ef50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://890f7421ceb8a5469f60d920755aa70b573c090fa2cb5e9d9a380886ce102500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://244e6f1f3798c59f4dcc20438e7b876835f82f5d98c5c913b7ff66bd97c5b62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-08T20:04:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b136f181d92ea625d4a997c6a670cda77244f397744ac75a31a50548d49c9139\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-08T20:04:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaf895bbfe62ceae156ccc8ad65b3e56582485edf9074a7c67f045883565da20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:46Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c8583e1e24d404340d5c277068ecc76289f793df3d202a61e4cb915de8b1b6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-08T20:04:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-08T20:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-08T20:04:44Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.369194 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.372988 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.373159 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.373397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc 
kubenswrapper[4781]: I1208 20:06:04.373486 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.373565 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.383598 4781 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-08T20:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-08T20:06:04Z is after 2025-08-24T17:21:41Z" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.476629 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.476934 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.477061 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 
20:06:04.477224 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.477383 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.579399 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.579454 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.579473 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.579495 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.579511 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.681989 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.682026 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.682036 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.682049 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.682076 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.784619 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.785227 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.785320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.785429 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.785495 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.888703 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.888760 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.888778 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.888801 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.888817 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.992284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.992327 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.992340 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.992357 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:04 crc kubenswrapper[4781]: I1208 20:06:04.992371 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:04Z","lastTransitionTime":"2025-12-08T20:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.094976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.095284 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.095436 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.095579 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.095693 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.125527 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.125533 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.125581 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:05 crc kubenswrapper[4781]: E1208 20:06:05.126284 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:05 crc kubenswrapper[4781]: E1208 20:06:05.126355 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.125587 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:05 crc kubenswrapper[4781]: E1208 20:06:05.126455 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:05 crc kubenswrapper[4781]: E1208 20:06:05.126727 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.198554 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.198612 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.198631 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.198656 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.198673 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.300912 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.301156 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.301238 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.301314 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.301388 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.404819 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.404879 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.404950 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.404990 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.405014 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.508465 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.508534 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.508551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.508573 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.508590 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.613032 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.613105 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.613127 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.613155 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.613187 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.715711 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.715754 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.715766 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.715783 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.715796 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.817836 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.817903 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.817949 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.817979 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.818040 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.921062 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.921397 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.921621 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.921792 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:05 crc kubenswrapper[4781]: I1208 20:06:05.921971 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:05Z","lastTransitionTime":"2025-12-08T20:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.024769 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.024813 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.024824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.024837 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.024846 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.127320 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.127373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.127399 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.127418 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.127441 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.229897 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.229975 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.229992 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.230017 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.230029 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.332980 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.333246 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.333331 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.333415 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.333495 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.436832 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.436895 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.436930 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.436951 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.436971 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.539709 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.539770 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.539785 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.539806 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.539821 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.642443 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.642485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.642494 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.642551 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.642564 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.744781 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.744830 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.744840 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.744857 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.744871 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.847834 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.847887 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.847902 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.847943 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.847959 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.950625 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.951028 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.951208 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.951375 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:06 crc kubenswrapper[4781]: I1208 20:06:06.951552 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:06Z","lastTransitionTime":"2025-12-08T20:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.016717 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.017102 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-08 20:07:11.017071697 +0000 UTC m=+147.168355084 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.055307 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.055553 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.055626 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.055701 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.055758 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.118040 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.118124 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.118183 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.118287 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118311 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118393 4781 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118419 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118450 4781 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118507 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.118470392 +0000 UTC m=+147.269753809 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118545 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.118526623 +0000 UTC m=+147.269810050 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118657 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118683 4781 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118702 4781 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118772 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.118740979 +0000 UTC m=+147.270024416 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.118908 4781 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.119000 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.118980066 +0000 UTC m=+147.270263573 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.125392 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.125569 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.125642 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.125711 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.126060 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.126631 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.126671 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.126774 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.158402 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.158447 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.158463 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.158485 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.158501 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.262037 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.262098 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.262120 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.262151 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.262172 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.365258 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.365323 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.365348 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.365381 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.365407 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.468864 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.468911 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.468944 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.468965 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.468980 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.572617 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.572646 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.572653 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.572666 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.572674 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.675974 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.676017 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.676029 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.676045 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.676057 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.725104 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.725356 4781 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: E1208 20:06:07.725467 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs podName:c74e396c-5b68-47be-b86b-9f48c02ec760 nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.72543612 +0000 UTC m=+147.876719547 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs") pod "network-metrics-daemon-gr5xw" (UID: "c74e396c-5b68-47be-b86b-9f48c02ec760") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.779051 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.779086 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.779094 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.779107 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.779117 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.881417 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.881450 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.881459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.881474 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.881487 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.983379 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.983434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.983446 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.983471 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:07 crc kubenswrapper[4781]: I1208 20:06:07.983485 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:07Z","lastTransitionTime":"2025-12-08T20:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.085905 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.085952 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.085960 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.085976 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.085989 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.187824 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.187868 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.187883 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.187899 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.187932 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.290671 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.290713 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.290743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.290763 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.290779 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.392684 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.392725 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.392742 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.392758 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.392768 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.495594 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.495702 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.495720 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.495743 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.495760 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.598373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.598413 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.598421 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.598434 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.598443 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.701161 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.701217 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.701233 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.701254 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.701268 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.803299 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.803336 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.803344 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.803373 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.803383 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.906117 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.906234 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.906252 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.906277 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.906295 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.983459 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.983519 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.983530 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.983545 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 08 20:06:08 crc kubenswrapper[4781]: I1208 20:06:08.983556 4781 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-08T20:06:08Z","lastTransitionTime":"2025-12-08T20:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.046783 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt"] Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.047572 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.052290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.052541 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.052676 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.052841 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.086949 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tm5z7" podStartSLOduration=66.086931141 podStartE2EDuration="1m6.086931141s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.07375943 +0000 UTC m=+85.225042817" watchObservedRunningTime="2025-12-08 20:06:09.086931141 +0000 UTC m=+85.238214518" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.101070 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jphfq" podStartSLOduration=65.101051738 podStartE2EDuration="1m5.101051738s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.087120746 +0000 UTC m=+85.238404153" watchObservedRunningTime="2025-12-08 20:06:09.101051738 +0000 UTC m=+85.252335115" Dec 08 20:06:09 crc 
kubenswrapper[4781]: I1208 20:06:09.132578 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.132792 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:09 crc kubenswrapper[4781]: E1208 20:06:09.132912 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.132820 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:09 crc kubenswrapper[4781]: E1208 20:06:09.133046 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.132792 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:09 crc kubenswrapper[4781]: E1208 20:06:09.133135 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:09 crc kubenswrapper[4781]: E1208 20:06:09.133194 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.137195 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.137175685 podStartE2EDuration="1m6.137175685s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.13699286 +0000 UTC m=+85.288276257" watchObservedRunningTime="2025-12-08 20:06:09.137175685 +0000 UTC m=+85.288459062" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.140669 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") 
" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.140725 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.140768 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.140812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.140881 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.172672 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.172655004 podStartE2EDuration="38.172655004s" podCreationTimestamp="2025-12-08 20:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.15546527 +0000 UTC m=+85.306748697" watchObservedRunningTime="2025-12-08 20:06:09.172655004 +0000 UTC m=+85.323938381" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.187087 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podStartSLOduration=66.18706759 podStartE2EDuration="1m6.18706759s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.186705 +0000 UTC m=+85.337988367" watchObservedRunningTime="2025-12-08 20:06:09.18706759 +0000 UTC m=+85.338350977" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.215779 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-568jn" podStartSLOduration=66.215757898 podStartE2EDuration="1m6.215757898s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.198651706 +0000 UTC m=+85.349935103" watchObservedRunningTime="2025-12-08 20:06:09.215757898 +0000 UTC m=+85.367041275" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.227935 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=66.22790352 podStartE2EDuration="1m6.22790352s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-08 20:06:09.215933133 +0000 UTC m=+85.367216510" watchObservedRunningTime="2025-12-08 20:06:09.22790352 +0000 UTC m=+85.379186897" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.242373 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.242439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.242479 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.242514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.242539 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.242620 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.242708 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.245026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-service-ca\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.251650 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.261383 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e18ed3b9-23e5-486c-b3ed-c39e0fa1d886-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-cbpvt\" (UID: \"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.318142 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zqc9l" podStartSLOduration=66.31811856 podStartE2EDuration="1m6.31811856s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.317728589 +0000 UTC m=+85.469011976" watchObservedRunningTime="2025-12-08 20:06:09.31811856 +0000 UTC m=+85.469401937" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.344117 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-vbcgq" podStartSLOduration=65.343986288 podStartE2EDuration="1m5.343986288s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.343519165 +0000 UTC m=+85.494802552" watchObservedRunningTime="2025-12-08 20:06:09.343986288 +0000 UTC m=+85.495269665" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.365640 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.381966 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=63.381949087 podStartE2EDuration="1m3.381949087s" podCreationTimestamp="2025-12-08 20:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.37888143 +0000 UTC m=+85.530164827" watchObservedRunningTime="2025-12-08 20:06:09.381949087 +0000 UTC m=+85.533232464" Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.620845 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" event={"ID":"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886","Type":"ContainerStarted","Data":"95f5a32e89a8377582343c3389e1110866919a4f379c089512131a2ea4bb7613"} Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.621097 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" event={"ID":"e18ed3b9-23e5-486c-b3ed-c39e0fa1d886","Type":"ContainerStarted","Data":"dcd881ffc2abf84624165c46958fb7824722976d806c8beb205bce8433e99b41"} Dec 08 20:06:09 crc kubenswrapper[4781]: I1208 20:06:09.633739 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-cbpvt" podStartSLOduration=65.633720845 podStartE2EDuration="1m5.633720845s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:09.633268242 +0000 UTC m=+85.784551629" watchObservedRunningTime="2025-12-08 20:06:09.633720845 +0000 UTC m=+85.785004222" Dec 08 20:06:11 crc kubenswrapper[4781]: I1208 
20:06:11.124884 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:11 crc kubenswrapper[4781]: I1208 20:06:11.124891 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:11 crc kubenswrapper[4781]: I1208 20:06:11.126166 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:11 crc kubenswrapper[4781]: I1208 20:06:11.126383 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:11 crc kubenswrapper[4781]: E1208 20:06:11.130547 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:11 crc kubenswrapper[4781]: E1208 20:06:11.130829 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:11 crc kubenswrapper[4781]: E1208 20:06:11.130824 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:11 crc kubenswrapper[4781]: E1208 20:06:11.130871 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:11 crc kubenswrapper[4781]: I1208 20:06:11.145154 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 08 20:06:13 crc kubenswrapper[4781]: I1208 20:06:13.125740 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:13 crc kubenswrapper[4781]: I1208 20:06:13.125846 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:13 crc kubenswrapper[4781]: I1208 20:06:13.125845 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:13 crc kubenswrapper[4781]: I1208 20:06:13.125902 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:13 crc kubenswrapper[4781]: E1208 20:06:13.126029 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:13 crc kubenswrapper[4781]: E1208 20:06:13.126167 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:13 crc kubenswrapper[4781]: E1208 20:06:13.126479 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:13 crc kubenswrapper[4781]: E1208 20:06:13.126737 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:14 crc kubenswrapper[4781]: I1208 20:06:14.141596 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=3.141572874 podStartE2EDuration="3.141572874s" podCreationTimestamp="2025-12-08 20:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:14.140468223 +0000 UTC m=+90.291751630" watchObservedRunningTime="2025-12-08 20:06:14.141572874 +0000 UTC m=+90.292856281" Dec 08 20:06:15 crc kubenswrapper[4781]: I1208 20:06:15.125281 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:15 crc kubenswrapper[4781]: I1208 20:06:15.125633 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:15 crc kubenswrapper[4781]: I1208 20:06:15.125632 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:15 crc kubenswrapper[4781]: I1208 20:06:15.125325 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:15 crc kubenswrapper[4781]: I1208 20:06:15.125834 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" Dec 08 20:06:15 crc kubenswrapper[4781]: E1208 20:06:15.125904 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:15 crc kubenswrapper[4781]: E1208 20:06:15.126019 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:15 crc kubenswrapper[4781]: E1208 20:06:15.126104 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:15 crc kubenswrapper[4781]: E1208 20:06:15.126113 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:06:15 crc kubenswrapper[4781]: E1208 20:06:15.126700 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:17 crc kubenswrapper[4781]: I1208 20:06:17.125876 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:17 crc kubenswrapper[4781]: E1208 20:06:17.126731 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:17 crc kubenswrapper[4781]: I1208 20:06:17.125961 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:17 crc kubenswrapper[4781]: I1208 20:06:17.126119 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:17 crc kubenswrapper[4781]: I1208 20:06:17.125898 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:17 crc kubenswrapper[4781]: E1208 20:06:17.127265 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:17 crc kubenswrapper[4781]: E1208 20:06:17.127465 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:17 crc kubenswrapper[4781]: E1208 20:06:17.127502 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:19 crc kubenswrapper[4781]: I1208 20:06:19.126192 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:19 crc kubenswrapper[4781]: I1208 20:06:19.126229 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:19 crc kubenswrapper[4781]: I1208 20:06:19.126237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:19 crc kubenswrapper[4781]: I1208 20:06:19.126191 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:19 crc kubenswrapper[4781]: E1208 20:06:19.126366 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:19 crc kubenswrapper[4781]: E1208 20:06:19.126435 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:19 crc kubenswrapper[4781]: E1208 20:06:19.126618 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:19 crc kubenswrapper[4781]: E1208 20:06:19.126813 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:21 crc kubenswrapper[4781]: I1208 20:06:21.124896 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:21 crc kubenswrapper[4781]: I1208 20:06:21.125082 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:21 crc kubenswrapper[4781]: E1208 20:06:21.125320 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:21 crc kubenswrapper[4781]: I1208 20:06:21.124875 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:21 crc kubenswrapper[4781]: E1208 20:06:21.125637 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:21 crc kubenswrapper[4781]: I1208 20:06:21.125962 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:21 crc kubenswrapper[4781]: E1208 20:06:21.126044 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:21 crc kubenswrapper[4781]: E1208 20:06:21.126169 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:23 crc kubenswrapper[4781]: I1208 20:06:23.125512 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:23 crc kubenswrapper[4781]: E1208 20:06:23.125641 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:23 crc kubenswrapper[4781]: I1208 20:06:23.125675 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:23 crc kubenswrapper[4781]: I1208 20:06:23.125751 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:23 crc kubenswrapper[4781]: E1208 20:06:23.125846 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:23 crc kubenswrapper[4781]: E1208 20:06:23.126070 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:23 crc kubenswrapper[4781]: I1208 20:06:23.126090 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:23 crc kubenswrapper[4781]: E1208 20:06:23.126300 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:25 crc kubenswrapper[4781]: I1208 20:06:25.125629 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:25 crc kubenswrapper[4781]: I1208 20:06:25.125670 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:25 crc kubenswrapper[4781]: I1208 20:06:25.125731 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:25 crc kubenswrapper[4781]: I1208 20:06:25.125751 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:25 crc kubenswrapper[4781]: E1208 20:06:25.125873 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:25 crc kubenswrapper[4781]: E1208 20:06:25.125998 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:25 crc kubenswrapper[4781]: E1208 20:06:25.126125 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:25 crc kubenswrapper[4781]: E1208 20:06:25.126268 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:27 crc kubenswrapper[4781]: I1208 20:06:27.125588 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:27 crc kubenswrapper[4781]: I1208 20:06:27.125675 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:27 crc kubenswrapper[4781]: E1208 20:06:27.125701 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:27 crc kubenswrapper[4781]: I1208 20:06:27.125764 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:27 crc kubenswrapper[4781]: I1208 20:06:27.125798 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:27 crc kubenswrapper[4781]: E1208 20:06:27.126056 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:27 crc kubenswrapper[4781]: E1208 20:06:27.126164 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:27 crc kubenswrapper[4781]: E1208 20:06:27.126400 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:28 crc kubenswrapper[4781]: I1208 20:06:28.125782 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" Dec 08 20:06:28 crc kubenswrapper[4781]: E1208 20:06:28.126435 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-67t9k_openshift-ovn-kubernetes(a3526d83-eb7e-486e-9357-80df536d09fd)\"" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" Dec 08 20:06:29 crc kubenswrapper[4781]: I1208 20:06:29.125012 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:29 crc kubenswrapper[4781]: I1208 20:06:29.125058 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:29 crc kubenswrapper[4781]: E1208 20:06:29.125207 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:29 crc kubenswrapper[4781]: I1208 20:06:29.125473 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:29 crc kubenswrapper[4781]: E1208 20:06:29.125573 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:29 crc kubenswrapper[4781]: I1208 20:06:29.125709 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:29 crc kubenswrapper[4781]: E1208 20:06:29.125864 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:29 crc kubenswrapper[4781]: E1208 20:06:29.126063 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:31 crc kubenswrapper[4781]: I1208 20:06:31.124965 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:31 crc kubenswrapper[4781]: I1208 20:06:31.125072 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:31 crc kubenswrapper[4781]: I1208 20:06:31.125107 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:31 crc kubenswrapper[4781]: I1208 20:06:31.125125 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:31 crc kubenswrapper[4781]: E1208 20:06:31.125246 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:31 crc kubenswrapper[4781]: E1208 20:06:31.125352 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:31 crc kubenswrapper[4781]: E1208 20:06:31.125407 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:31 crc kubenswrapper[4781]: E1208 20:06:31.125471 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:33 crc kubenswrapper[4781]: I1208 20:06:33.125086 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:33 crc kubenswrapper[4781]: I1208 20:06:33.125136 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:33 crc kubenswrapper[4781]: I1208 20:06:33.125166 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:33 crc kubenswrapper[4781]: I1208 20:06:33.125104 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:33 crc kubenswrapper[4781]: E1208 20:06:33.125278 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:33 crc kubenswrapper[4781]: E1208 20:06:33.125438 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:33 crc kubenswrapper[4781]: E1208 20:06:33.125550 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:33 crc kubenswrapper[4781]: E1208 20:06:33.125653 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:35 crc kubenswrapper[4781]: I1208 20:06:35.125769 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:35 crc kubenswrapper[4781]: E1208 20:06:35.126851 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:35 crc kubenswrapper[4781]: I1208 20:06:35.125842 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:35 crc kubenswrapper[4781]: I1208 20:06:35.125806 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:35 crc kubenswrapper[4781]: E1208 20:06:35.127415 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:35 crc kubenswrapper[4781]: I1208 20:06:35.125864 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:35 crc kubenswrapper[4781]: E1208 20:06:35.127840 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:35 crc kubenswrapper[4781]: E1208 20:06:35.127733 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:36 crc kubenswrapper[4781]: I1208 20:06:36.705057 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/1.log" Dec 08 20:06:36 crc kubenswrapper[4781]: I1208 20:06:36.705597 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/0.log" Dec 08 20:06:36 crc kubenswrapper[4781]: I1208 20:06:36.705634 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20" containerID="0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a" exitCode=1 Dec 08 20:06:36 crc kubenswrapper[4781]: I1208 20:06:36.705683 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tm5z7" event={"ID":"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20","Type":"ContainerDied","Data":"0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a"} Dec 08 20:06:36 crc kubenswrapper[4781]: I1208 20:06:36.705757 4781 scope.go:117] "RemoveContainer" containerID="4c23a03ee9d96c90baf894543f673f47f8f4f61126e874fa6e9ebd1a5913ebd8" Dec 08 20:06:36 crc kubenswrapper[4781]: I1208 20:06:36.706467 4781 scope.go:117] "RemoveContainer" containerID="0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a" Dec 08 20:06:36 crc 
kubenswrapper[4781]: E1208 20:06:36.707013 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tm5z7_openshift-multus(a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20)\"" pod="openshift-multus/multus-tm5z7" podUID="a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20" Dec 08 20:06:37 crc kubenswrapper[4781]: I1208 20:06:37.125144 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:37 crc kubenswrapper[4781]: I1208 20:06:37.125245 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:37 crc kubenswrapper[4781]: E1208 20:06:37.125380 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:37 crc kubenswrapper[4781]: I1208 20:06:37.125426 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:37 crc kubenswrapper[4781]: I1208 20:06:37.125424 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:37 crc kubenswrapper[4781]: E1208 20:06:37.125517 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:37 crc kubenswrapper[4781]: E1208 20:06:37.125705 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:37 crc kubenswrapper[4781]: E1208 20:06:37.125844 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:37 crc kubenswrapper[4781]: I1208 20:06:37.712843 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/1.log" Dec 08 20:06:39 crc kubenswrapper[4781]: I1208 20:06:39.125739 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:39 crc kubenswrapper[4781]: I1208 20:06:39.125864 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:39 crc kubenswrapper[4781]: I1208 20:06:39.126199 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:39 crc kubenswrapper[4781]: I1208 20:06:39.126236 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:39 crc kubenswrapper[4781]: E1208 20:06:39.126312 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:39 crc kubenswrapper[4781]: E1208 20:06:39.126551 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:39 crc kubenswrapper[4781]: E1208 20:06:39.126691 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:39 crc kubenswrapper[4781]: E1208 20:06:39.126869 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.125473 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:41 crc kubenswrapper[4781]: E1208 20:06:41.125661 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.125959 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:41 crc kubenswrapper[4781]: E1208 20:06:41.126030 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.126163 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:41 crc kubenswrapper[4781]: E1208 20:06:41.126258 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.126476 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:41 crc kubenswrapper[4781]: E1208 20:06:41.126550 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.127719 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.727304 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/3.log" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.729556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerStarted","Data":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"} Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.729884 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.848591 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podStartSLOduration=98.848572077 podStartE2EDuration="1m38.848572077s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:06:41.753775689 +0000 UTC m=+117.905059066" watchObservedRunningTime="2025-12-08 20:06:41.848572077 +0000 UTC m=+117.999855454" Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.849405 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gr5xw"] Dec 08 20:06:41 crc kubenswrapper[4781]: I1208 20:06:41.849524 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:41 crc kubenswrapper[4781]: E1208 20:06:41.849636 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:43 crc kubenswrapper[4781]: I1208 20:06:43.125153 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:43 crc kubenswrapper[4781]: I1208 20:06:43.125200 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:43 crc kubenswrapper[4781]: E1208 20:06:43.125254 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:43 crc kubenswrapper[4781]: I1208 20:06:43.125206 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:43 crc kubenswrapper[4781]: E1208 20:06:43.125401 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:43 crc kubenswrapper[4781]: E1208 20:06:43.125465 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:44 crc kubenswrapper[4781]: I1208 20:06:44.125226 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:44 crc kubenswrapper[4781]: E1208 20:06:44.126193 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:44 crc kubenswrapper[4781]: E1208 20:06:44.147884 4781 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 08 20:06:44 crc kubenswrapper[4781]: E1208 20:06:44.259158 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 20:06:45 crc kubenswrapper[4781]: I1208 20:06:45.125848 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:45 crc kubenswrapper[4781]: I1208 20:06:45.125897 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:45 crc kubenswrapper[4781]: E1208 20:06:45.126451 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:45 crc kubenswrapper[4781]: I1208 20:06:45.125997 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:45 crc kubenswrapper[4781]: E1208 20:06:45.126584 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:45 crc kubenswrapper[4781]: E1208 20:06:45.126775 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:46 crc kubenswrapper[4781]: I1208 20:06:46.125424 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:46 crc kubenswrapper[4781]: E1208 20:06:46.125569 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:47 crc kubenswrapper[4781]: I1208 20:06:47.125353 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:47 crc kubenswrapper[4781]: I1208 20:06:47.125392 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:47 crc kubenswrapper[4781]: I1208 20:06:47.125375 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:47 crc kubenswrapper[4781]: E1208 20:06:47.125564 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:47 crc kubenswrapper[4781]: E1208 20:06:47.125715 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:47 crc kubenswrapper[4781]: E1208 20:06:47.125828 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:48 crc kubenswrapper[4781]: I1208 20:06:48.125843 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:48 crc kubenswrapper[4781]: E1208 20:06:48.127202 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:49 crc kubenswrapper[4781]: I1208 20:06:49.125431 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:49 crc kubenswrapper[4781]: I1208 20:06:49.125477 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:49 crc kubenswrapper[4781]: I1208 20:06:49.125834 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:49 crc kubenswrapper[4781]: E1208 20:06:49.126021 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:49 crc kubenswrapper[4781]: E1208 20:06:49.126441 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:49 crc kubenswrapper[4781]: E1208 20:06:49.126596 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:49 crc kubenswrapper[4781]: E1208 20:06:49.260463 4781 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 08 20:06:50 crc kubenswrapper[4781]: I1208 20:06:50.125162 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:50 crc kubenswrapper[4781]: E1208 20:06:50.125310 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:50 crc kubenswrapper[4781]: I1208 20:06:50.125822 4781 scope.go:117] "RemoveContainer" containerID="0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a" Dec 08 20:06:50 crc kubenswrapper[4781]: I1208 20:06:50.770122 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/1.log" Dec 08 20:06:50 crc kubenswrapper[4781]: I1208 20:06:50.770500 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tm5z7" event={"ID":"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20","Type":"ContainerStarted","Data":"aa1a90ca80410aee290e00aa4bec32980f22fe04e8d7dcac754b42a8fa098950"} Dec 08 20:06:51 crc kubenswrapper[4781]: I1208 20:06:51.124825 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:51 crc kubenswrapper[4781]: E1208 20:06:51.125070 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:51 crc kubenswrapper[4781]: I1208 20:06:51.125147 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:51 crc kubenswrapper[4781]: I1208 20:06:51.125210 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:51 crc kubenswrapper[4781]: E1208 20:06:51.125338 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:51 crc kubenswrapper[4781]: E1208 20:06:51.125487 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:52 crc kubenswrapper[4781]: I1208 20:06:52.124958 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:52 crc kubenswrapper[4781]: E1208 20:06:52.125127 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:53 crc kubenswrapper[4781]: I1208 20:06:53.125024 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:53 crc kubenswrapper[4781]: I1208 20:06:53.125075 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:53 crc kubenswrapper[4781]: I1208 20:06:53.125165 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:53 crc kubenswrapper[4781]: E1208 20:06:53.125204 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 08 20:06:53 crc kubenswrapper[4781]: E1208 20:06:53.125273 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 08 20:06:53 crc kubenswrapper[4781]: E1208 20:06:53.125405 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 08 20:06:54 crc kubenswrapper[4781]: I1208 20:06:54.125794 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:54 crc kubenswrapper[4781]: E1208 20:06:54.126876 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gr5xw" podUID="c74e396c-5b68-47be-b86b-9f48c02ec760" Dec 08 20:06:55 crc kubenswrapper[4781]: I1208 20:06:55.125825 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:06:55 crc kubenswrapper[4781]: I1208 20:06:55.125863 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:06:55 crc kubenswrapper[4781]: I1208 20:06:55.125824 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:06:55 crc kubenswrapper[4781]: I1208 20:06:55.127938 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 08 20:06:55 crc kubenswrapper[4781]: I1208 20:06:55.128702 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 08 20:06:55 crc kubenswrapper[4781]: I1208 20:06:55.128728 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 08 20:06:55 crc kubenswrapper[4781]: I1208 20:06:55.128848 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 08 20:06:56 crc kubenswrapper[4781]: I1208 20:06:56.125346 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:06:56 crc kubenswrapper[4781]: I1208 20:06:56.127436 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 08 20:06:56 crc kubenswrapper[4781]: I1208 20:06:56.127499 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.336398 4781 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.379491 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hjncp"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.379943 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.380720 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.381241 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.381611 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gdrf7"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.381894 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.382373 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.383303 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.383543 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.384058 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.384428 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.385212 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.385426 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.386225 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.386896 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.388700 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q2qbx"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.389212 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.389471 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.391070 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lx5mt"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.391670 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.392706 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.393570 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.394227 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kf259"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.394677 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.395350 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.396534 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.397356 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.399251 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.399393 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dfxlz"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.399904 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dfxlz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.400339 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.400814 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.401799 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-tmmk6"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.402293 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.403157 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.403285 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.403439 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.403661 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.403715 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.404087 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.404486 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.405324 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.406847 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vq67l"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.408018 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-546nt"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 
20:07:00.408626 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dh9vd"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.409115 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.409360 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.409787 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.410479 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-586sh"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.410540 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.424287 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.426559 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.426598 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.426672 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.434684 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.434836 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.434964 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.435090 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.435147 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.435268 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.435369 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: 
I1208 20:07:00.435401 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.435712 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.435908 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.440262 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.440421 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.440863 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.441151 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.441285 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.441415 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.443043 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.445403 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" 
Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.446641 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.447006 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.447189 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.447824 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.447876 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.447953 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.447969 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448045 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448077 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.447888 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448162 4781 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448249 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448254 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448289 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448331 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448347 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448370 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448406 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448415 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448480 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448516 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 
20:07:00.448535 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448552 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448616 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448629 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448672 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448702 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448708 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448752 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448790 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448797 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448797 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448909 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448952 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.448929 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.449192 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.449442 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.449842 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.450536 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.455297 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.457165 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.457323 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.457506 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.457602 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.458094 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.458140 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.462305 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.462726 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.464201 4781 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.464680 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.464993 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.465759 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.465824 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-k9nwr"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.466286 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.469285 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.469462 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.469609 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.469764 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483010 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483051 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513ea4da-405c-4176-adbd-c8e5f68c631c-config\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483075 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483092 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-trusted-ca\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483108 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483127 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-config\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483149 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483174 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz9px\" (UniqueName: \"kubernetes.io/projected/a0818ea3-b629-47a3-8edb-d77e60a23068-kube-api-access-kz9px\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483191 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mdwc7\" (UID: \"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483224 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l7kb\" (UniqueName: \"kubernetes.io/projected/80f07ecd-4175-4ed0-b11e-507b0b7a783f-kube-api-access-7l7kb\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483256 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483454 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f573c3d-716d-4f05-9308-6983d8c30570-config\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483530 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-etcd-client\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483557 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzz2c\" (UniqueName: \"kubernetes.io/projected/152c0bd2-ed47-4f77-851d-562166d3bc1f-kube-api-access-pzz2c\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483582 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483615 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-config\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483635 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e761a9a6-bbf9-4bc8-9c36-358522654b25-node-pullsecrets\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483655 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zj6\" (UniqueName: \"kubernetes.io/projected/5669be4d-29d4-4cee-ad63-75f37e3727d2-kube-api-access-p6zj6\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483677 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2f74\" (UniqueName: \"kubernetes.io/projected/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-kube-api-access-p2f74\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3469de51-445a-4bda-9f65-d7e62b7ce452-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483739 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e761a9a6-bbf9-4bc8-9c36-358522654b25-audit-dir\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483762 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8985542-d2e9-4677-a112-3caacb378c86-auth-proxy-config\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483782 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152c0bd2-ed47-4f77-851d-562166d3bc1f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483803 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483827 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-service-ca-bundle\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483848 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-encryption-config\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483869 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-etcd-client\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483888 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3469de51-445a-4bda-9f65-d7e62b7ce452-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec45fa5-9d85-4aea-b458-79c149b76828-serving-cert\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.483980 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484008 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-serving-cert\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484030 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484055 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-config\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484077 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-metrics-tls\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-etcd-serving-ca\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484126 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484165 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3469de51-445a-4bda-9f65-d7e62b7ce452-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484189 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhszh\" (UniqueName: \"kubernetes.io/projected/513ea4da-405c-4176-adbd-c8e5f68c631c-kube-api-access-lhszh\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484211 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484242 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-audit\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484266 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484288 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a8985542-d2e9-4677-a112-3caacb378c86-machine-approver-tls\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484311 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-trusted-ca-bundle\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484336 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86b98\" (UniqueName: \"kubernetes.io/projected/6529df15-5883-406f-a2e1-96fa351af72d-kube-api-access-86b98\") pod \"dns-operator-744455d44c-k9nwr\" (UID: \"6529df15-5883-406f-a2e1-96fa351af72d\") " pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484359 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsfjr\" (UniqueName: \"kubernetes.io/projected/fecbaff0-859c-4fc4-b2fd-56fbc53c192b-kube-api-access-lsfjr\") pod \"migrator-59844c95c7-jmklp\" (UID: \"fecbaff0-859c-4fc4-b2fd-56fbc53c192b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484378 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484400 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc 
kubenswrapper[4781]: I1208 20:07:00.484420 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-audit-dir\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484443 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gchht\" (UniqueName: \"kubernetes.io/projected/0f573c3d-716d-4f05-9308-6983d8c30570-kube-api-access-gchht\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-encryption-config\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484484 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db57361-77ab-43d3-acc5-d4de29c8f13e-serving-cert\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484505 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-serving-cert\") pod 
\"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484530 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv2hr\" (UniqueName: \"kubernetes.io/projected/8db57361-77ab-43d3-acc5-d4de29c8f13e-kube-api-access-hv2hr\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484551 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-serving-cert\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f573c3d-716d-4f05-9308-6983d8c30570-serving-cert\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484595 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484621 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8gf9\" (UniqueName: \"kubernetes.io/projected/3469de51-445a-4bda-9f65-d7e62b7ce452-kube-api-access-f8gf9\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484643 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152c0bd2-ed47-4f77-851d-562166d3bc1f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484664 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzcbx\" (UniqueName: \"kubernetes.io/projected/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-kube-api-access-mzcbx\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484691 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-client-ca\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484727 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-service-ca\") pod 
\"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484750 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-config\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.484770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2g6m\" (UniqueName: \"kubernetes.io/projected/3ec45fa5-9d85-4aea-b458-79c149b76828-kube-api-access-j2g6m\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.490171 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.490253 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8985542-d2e9-4677-a112-3caacb378c86-config\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.490301 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-oauth-serving-cert\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.493853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/513ea4da-405c-4176-adbd-c8e5f68c631c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.493901 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3ec45fa5-9d85-4aea-b458-79c149b76828-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.493952 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.494161 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-oauth-config\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.494190 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-audit-policies\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.494392 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6529df15-5883-406f-a2e1-96fa351af72d-metrics-tls\") pod \"dns-operator-744455d44c-k9nwr\" (UID: \"6529df15-5883-406f-a2e1-96fa351af72d\") " pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.494429 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f573c3d-716d-4f05-9308-6983d8c30570-trusted-ca\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.494462 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84vhs\" (UniqueName: \"kubernetes.io/projected/e761a9a6-bbf9-4bc8-9c36-358522654b25-kube-api-access-84vhs\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.494682 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-627ks\" (UniqueName: \"kubernetes.io/projected/a8985542-d2e9-4677-a112-3caacb378c86-kube-api-access-627ks\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.494713 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.494943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/21854278-8a87-40ee-9209-509132febd54-kube-api-access-j2x4h\") pod \"downloads-7954f5f757-dfxlz\" (UID: \"21854278-8a87-40ee-9209-509132febd54\") " pod="openshift-console/downloads-7954f5f757-dfxlz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.495008 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpcns\" (UniqueName: \"kubernetes.io/projected/b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd-kube-api-access-xpcns\") pod \"cluster-samples-operator-665b6dd947-mdwc7\" (UID: \"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.495819 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.496758 4781 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.498147 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.498207 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.498427 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.499754 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.499802 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.499828 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.499884 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.500113 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.500217 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.500483 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.500800 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.500894 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-policies\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.500960 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.500993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-image-import-ca\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.501164 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-dir\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.501224 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/513ea4da-405c-4176-adbd-c8e5f68c631c-images\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.501257 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f07ecd-4175-4ed0-b11e-507b0b7a783f-serving-cert\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.502813 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-k7smf"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.512383 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.513143 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.514638 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h4lz2"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.515387 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.515638 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.515847 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.515907 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.517497 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.518963 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.519839 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.520151 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.520804 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.521869 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.522608 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.522677 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.523273 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.528273 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.529628 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.532547 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.533251 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.533315 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.533442 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.541678 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.542175 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.542371 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.542462 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.542693 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.542798 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7lss8"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.543108 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.543452 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.543666 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.543842 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.543996 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.544460 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.550977 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6vd86"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.551535 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.551891 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.552068 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.552154 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lx5mt"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.554674 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.554700 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.563258 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.568821 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gdrf7"] Dec 08 20:07:00 crc 
kubenswrapper[4781]: I1208 20:07:00.569936 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.571188 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.586409 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.586506 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hjncp"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.586673 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vq67l"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.592713 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-546nt"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.596453 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5cnmw"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.597526 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5cnmw" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.598782 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dx6bv"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.602869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-audit-policies\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.602907 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.602964 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6529df15-5883-406f-a2e1-96fa351af72d-metrics-tls\") pod \"dns-operator-744455d44c-k9nwr\" (UID: \"6529df15-5883-406f-a2e1-96fa351af72d\") " pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.602995 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f573c3d-716d-4f05-9308-6983d8c30570-trusted-ca\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603016 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84vhs\" (UniqueName: \"kubernetes.io/projected/e761a9a6-bbf9-4bc8-9c36-358522654b25-kube-api-access-84vhs\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-627ks\" (UniqueName: \"kubernetes.io/projected/a8985542-d2e9-4677-a112-3caacb378c86-kube-api-access-627ks\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603057 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/21854278-8a87-40ee-9209-509132febd54-kube-api-access-j2x4h\") pod \"downloads-7954f5f757-dfxlz\" (UID: \"21854278-8a87-40ee-9209-509132febd54\") " pod="openshift-console/downloads-7954f5f757-dfxlz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpcns\" (UniqueName: \"kubernetes.io/projected/b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd-kube-api-access-xpcns\") pod \"cluster-samples-operator-665b6dd947-mdwc7\" (UID: \"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603115 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-policies\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603131 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603147 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-image-import-ca\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603160 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-dir\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603184 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f07ecd-4175-4ed0-b11e-507b0b7a783f-serving-cert\") pod 
\"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/513ea4da-405c-4176-adbd-c8e5f68c631c-images\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603237 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-config\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603251 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513ea4da-405c-4176-adbd-c8e5f68c631c-config\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603266 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-trusted-ca\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603291 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dh9vd"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603301 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mdwc7\" (UID: \"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603826 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603858 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz9px\" (UniqueName: \"kubernetes.io/projected/a0818ea3-b629-47a3-8edb-d77e60a23068-kube-api-access-kz9px\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.603984 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604028 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l7kb\" (UniqueName: \"kubernetes.io/projected/80f07ecd-4175-4ed0-b11e-507b0b7a783f-kube-api-access-7l7kb\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604053 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-config\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 
20:07:00.604073 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f573c3d-716d-4f05-9308-6983d8c30570-config\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-etcd-client\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604120 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzz2c\" (UniqueName: \"kubernetes.io/projected/152c0bd2-ed47-4f77-851d-562166d3bc1f-kube-api-access-pzz2c\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604141 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604165 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3469de51-445a-4bda-9f65-d7e62b7ce452-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604187 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e761a9a6-bbf9-4bc8-9c36-358522654b25-node-pullsecrets\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6zj6\" (UniqueName: \"kubernetes.io/projected/5669be4d-29d4-4cee-ad63-75f37e3727d2-kube-api-access-p6zj6\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604236 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2f74\" (UniqueName: \"kubernetes.io/projected/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-kube-api-access-p2f74\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604277 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-encryption-config\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604300 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e761a9a6-bbf9-4bc8-9c36-358522654b25-audit-dir\") pod \"apiserver-76f77b778f-vq67l\" (UID: 
\"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604325 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8985542-d2e9-4677-a112-3caacb378c86-auth-proxy-config\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604348 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152c0bd2-ed47-4f77-851d-562166d3bc1f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604370 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604395 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-service-ca-bundle\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604422 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-etcd-client\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604444 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3469de51-445a-4bda-9f65-d7e62b7ce452-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604469 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec45fa5-9d85-4aea-b458-79c149b76828-serving-cert\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604495 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604518 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604544 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-serving-cert\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604563 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-etcd-serving-ca\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604585 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-config\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-metrics-tls\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604634 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-546nt\" (UID: 
\"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604657 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604691 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3469de51-445a-4bda-9f65-d7e62b7ce452-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604722 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhszh\" (UniqueName: \"kubernetes.io/projected/513ea4da-405c-4176-adbd-c8e5f68c631c-kube-api-access-lhszh\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604720 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0f573c3d-716d-4f05-9308-6983d8c30570-trusted-ca\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604748 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86b98\" 
(UniqueName: \"kubernetes.io/projected/6529df15-5883-406f-a2e1-96fa351af72d-kube-api-access-86b98\") pod \"dns-operator-744455d44c-k9nwr\" (UID: \"6529df15-5883-406f-a2e1-96fa351af72d\") " pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604772 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-audit\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604797 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604819 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a8985542-d2e9-4677-a112-3caacb378c86-machine-approver-tls\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604840 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-trusted-ca-bundle\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lsfjr\" (UniqueName: \"kubernetes.io/projected/fecbaff0-859c-4fc4-b2fd-56fbc53c192b-kube-api-access-lsfjr\") pod \"migrator-59844c95c7-jmklp\" (UID: \"fecbaff0-859c-4fc4-b2fd-56fbc53c192b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604888 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604960 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.604987 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-audit-dir\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gchht\" (UniqueName: \"kubernetes.io/projected/0f573c3d-716d-4f05-9308-6983d8c30570-kube-api-access-gchht\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 
20:07:00.605039 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-encryption-config\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db57361-77ab-43d3-acc5-d4de29c8f13e-serving-cert\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605085 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f573c3d-716d-4f05-9308-6983d8c30570-serving-cert\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-serving-cert\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605149 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv2hr\" (UniqueName: \"kubernetes.io/projected/8db57361-77ab-43d3-acc5-d4de29c8f13e-kube-api-access-hv2hr\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605176 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-serving-cert\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605199 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605223 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8gf9\" (UniqueName: \"kubernetes.io/projected/3469de51-445a-4bda-9f65-d7e62b7ce452-kube-api-access-f8gf9\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605246 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152c0bd2-ed47-4f77-851d-562166d3bc1f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605268 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzcbx\" (UniqueName: 
\"kubernetes.io/projected/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-kube-api-access-mzcbx\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607159 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-client-ca\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607231 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-service-ca\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607256 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-config\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2g6m\" (UniqueName: \"kubernetes.io/projected/3ec45fa5-9d85-4aea-b458-79c149b76828-kube-api-access-j2g6m\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607313 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607336 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8985542-d2e9-4677-a112-3caacb378c86-config\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607360 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-oauth-serving-cert\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607389 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-oauth-config\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607409 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/513ea4da-405c-4176-adbd-c8e5f68c631c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 
20:07:00.607431 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3ec45fa5-9d85-4aea-b458-79c149b76828-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.607456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.608151 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-config\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.608587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-policies\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.608840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.608848 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.611520 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.611554 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-audit-policies\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.611967 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a8985542-d2e9-4677-a112-3caacb378c86-auth-proxy-config\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.612337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 
crc kubenswrapper[4781]: I1208 20:07:00.612624 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.613076 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mdwc7\" (UID: \"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.616263 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.616323 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.616339 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h4lz2"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.616998 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-audit\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.617110 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.617324 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-trusted-ca\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.617339 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.617787 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-trusted-ca-bundle\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605408 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-image-import-ca\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.618399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0f573c3d-716d-4f05-9308-6983d8c30570-config\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.618968 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-client-ca\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605445 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-dir\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.619663 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-serving-cert\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.620125 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-etcd-serving-ca\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.620169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-service-ca\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605827 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/513ea4da-405c-4176-adbd-c8e5f68c631c-config\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.620836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e761a9a6-bbf9-4bc8-9c36-358522654b25-config\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.605900 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.606731 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/513ea4da-405c-4176-adbd-c8e5f68c631c-images\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.621496 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8985542-d2e9-4677-a112-3caacb378c86-config\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 
20:07:00.621802 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152c0bd2-ed47-4f77-851d-562166d3bc1f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.621932 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.622203 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-config\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.622789 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-trusted-ca-bundle\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.622979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.623201 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f573c3d-716d-4f05-9308-6983d8c30570-serving-cert\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.623253 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-etcd-client\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.623574 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-audit-dir\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.623974 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.624131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.624714 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 08 
20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.625121 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-oauth-serving-cert\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.625243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-encryption-config\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.625449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-metrics-tls\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.626047 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.626245 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e761a9a6-bbf9-4bc8-9c36-358522654b25-audit-dir\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc 
kubenswrapper[4781]: I1208 20:07:00.626545 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3ec45fa5-9d85-4aea-b458-79c149b76828-available-featuregates\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.626770 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3469de51-445a-4bda-9f65-d7e62b7ce452-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.626997 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e761a9a6-bbf9-4bc8-9c36-358522654b25-node-pullsecrets\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.627004 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q2qbx"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.627587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-serving-cert\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.627942 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-serving-cert\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.628422 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-586sh"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.628550 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db57361-77ab-43d3-acc5-d4de29c8f13e-serving-cert\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.629867 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.629904 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.630838 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.631884 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e761a9a6-bbf9-4bc8-9c36-358522654b25-etcd-client\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " 
pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.632138 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.632559 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152c0bd2-ed47-4f77-851d-562166d3bc1f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.634488 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tmmk6"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.634516 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dfxlz"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.636749 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.636774 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.638019 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.638418 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.638878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-oauth-config\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.640659 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-encryption-config\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.640687 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.640704 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-k9nwr"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.641021 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.641132 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-546nt\" (UID: 
\"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.642004 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6vd86"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.642314 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.642419 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ec45fa5-9d85-4aea-b458-79c149b76828-serving-cert\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.642588 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/513ea4da-405c-4176-adbd-c8e5f68c631c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.642890 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.644705 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.646000 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.646551 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f07ecd-4175-4ed0-b11e-507b0b7a783f-serving-cert\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.646590 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nxdff"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.646667 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-config\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.646707 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-service-ca-bundle\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.646711 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a8985542-d2e9-4677-a112-3caacb378c86-machine-approver-tls\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.647336 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/80f07ecd-4175-4ed0-b11e-507b0b7a783f-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.647478 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.648679 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7lss8"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.650324 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.650701 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.651792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.653054 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.654297 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.655276 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nxdff"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.656298 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dx6bv"] Dec 08 20:07:00 crc 
kubenswrapper[4781]: I1208 20:07:00.657567 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fpzkk"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.658284 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fpzkk" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.659779 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fpzkk"] Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.662483 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.701985 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.722055 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.734292 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3469de51-445a-4bda-9f65-d7e62b7ce452-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.742765 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.763126 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 08 20:07:00 crc 
kubenswrapper[4781]: I1208 20:07:00.767145 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.781728 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.801780 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.811000 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.822571 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.842956 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.862071 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.882241 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.902757 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.922122 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.942560 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.962180 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 08 20:07:00 crc kubenswrapper[4781]: I1208 20:07:00.982135 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.002061 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.022941 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.042267 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.062604 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.082720 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.089981 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6529df15-5883-406f-a2e1-96fa351af72d-metrics-tls\") pod \"dns-operator-744455d44c-k9nwr\" (UID: \"6529df15-5883-406f-a2e1-96fa351af72d\") " pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.102685 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.143006 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.162171 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.182985 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.202276 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.222314 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.242171 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.262142 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.282258 4781 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.302812 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.322716 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.342499 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.361992 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.382508 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.402697 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.421954 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.442522 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.462671 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.482552 4781 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.501889 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.523149 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.540856 4781 request.go:700] Waited for 1.017778143s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-controller-manager-operator-config&limit=500&resourceVersion=0 Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.542717 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.563371 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.582788 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.601867 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.622941 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 
20:07:01.642878 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.661402 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.682727 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.704775 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.721912 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.742787 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.763744 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.782282 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.802673 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.822896 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.843053 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.863423 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.882478 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.902695 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.922163 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.942136 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.962771 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 08 20:07:01 crc kubenswrapper[4781]: I1208 20:07:01.994495 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.003120 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.023366 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.043125 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.062863 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.083078 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.102961 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.121787 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.142669 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.162196 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.182822 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.202025 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.250055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2x4h\" (UniqueName: \"kubernetes.io/projected/21854278-8a87-40ee-9209-509132febd54-kube-api-access-j2x4h\") pod \"downloads-7954f5f757-dfxlz\" (UID: \"21854278-8a87-40ee-9209-509132febd54\") " pod="openshift-console/downloads-7954f5f757-dfxlz" Dec 08 20:07:02 crc kubenswrapper[4781]: 
I1208 20:07:02.265433 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.283949 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpcns\" (UniqueName: \"kubernetes.io/projected/b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd-kube-api-access-xpcns\") pod \"cluster-samples-operator-665b6dd947-mdwc7\" (UID: \"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.303656 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-627ks\" (UniqueName: \"kubernetes.io/projected/a8985542-d2e9-4677-a112-3caacb378c86-kube-api-access-627ks\") pod \"machine-approver-56656f9798-kf259\" (UID: \"a8985542-d2e9-4677-a112-3caacb378c86\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.319946 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84vhs\" (UniqueName: \"kubernetes.io/projected/e761a9a6-bbf9-4bc8-9c36-358522654b25-kube-api-access-84vhs\") pod \"apiserver-76f77b778f-vq67l\" (UID: \"e761a9a6-bbf9-4bc8-9c36-358522654b25\") " pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.335106 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.343147 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.345671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86b98\" (UniqueName: \"kubernetes.io/projected/6529df15-5883-406f-a2e1-96fa351af72d-kube-api-access-86b98\") pod \"dns-operator-744455d44c-k9nwr\" (UID: \"6529df15-5883-406f-a2e1-96fa351af72d\") " pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.345885 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dfxlz" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.363059 4781 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.384053 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.415231 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-vq67l" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.425766 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d53e0cc-045b-4cb2-a5d3-be89b73f98e5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pbxvs\" (UID: \"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.439294 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzz2c\" (UniqueName: \"kubernetes.io/projected/152c0bd2-ed47-4f77-851d-562166d3bc1f-kube-api-access-pzz2c\") pod \"openshift-apiserver-operator-796bbdcf4f-nvb2t\" (UID: \"152c0bd2-ed47-4f77-851d-562166d3bc1f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.457750 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l7kb\" (UniqueName: \"kubernetes.io/projected/80f07ecd-4175-4ed0-b11e-507b0b7a783f-kube-api-access-7l7kb\") pod \"authentication-operator-69f744f599-q2qbx\" (UID: \"80f07ecd-4175-4ed0-b11e-507b0b7a783f\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.474497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz9px\" (UniqueName: \"kubernetes.io/projected/a0818ea3-b629-47a3-8edb-d77e60a23068-kube-api-access-kz9px\") pod \"oauth-openshift-558db77b4-546nt\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.498821 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.500876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzcbx\" (UniqueName: \"kubernetes.io/projected/b0b17f17-89ff-4f30-a316-6b6a17d8b7a2-kube-api-access-mzcbx\") pod \"ingress-operator-5b745b69d9-qtz49\" (UID: \"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.507757 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7"] Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.518691 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv2hr\" (UniqueName: \"kubernetes.io/projected/8db57361-77ab-43d3-acc5-d4de29c8f13e-kube-api-access-hv2hr\") pod \"controller-manager-879f6c89f-hjncp\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.535837 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.536458 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8gf9\" (UniqueName: \"kubernetes.io/projected/3469de51-445a-4bda-9f65-d7e62b7ce452-kube-api-access-f8gf9\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.544987 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dfxlz"] Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.552118 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.555234 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2g6m\" (UniqueName: \"kubernetes.io/projected/3ec45fa5-9d85-4aea-b458-79c149b76828-kube-api-access-j2g6m\") pod \"openshift-config-operator-7777fb866f-d2jpz\" (UID: \"3ec45fa5-9d85-4aea-b458-79c149b76828\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.560465 4781 request.go:700] Waited for 1.936948811s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/serviceaccounts/kube-storage-version-migrator-sa/token Dec 08 20:07:02 crc kubenswrapper[4781]: W1208 20:07:02.560642 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21854278_8a87_40ee_9209_509132febd54.slice/crio-c0645a2650aafc7370e13381648e93981dee0b1394f3372ae0b61308d0988b24 WatchSource:0}: 
Error finding container c0645a2650aafc7370e13381648e93981dee0b1394f3372ae0b61308d0988b24: Status 404 returned error can't find the container with id c0645a2650aafc7370e13381648e93981dee0b1394f3372ae0b61308d0988b24 Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.567454 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.581971 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsfjr\" (UniqueName: \"kubernetes.io/projected/fecbaff0-859c-4fc4-b2fd-56fbc53c192b-kube-api-access-lsfjr\") pod \"migrator-59844c95c7-jmklp\" (UID: \"fecbaff0-859c-4fc4-b2fd-56fbc53c192b\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.597079 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3469de51-445a-4bda-9f65-d7e62b7ce452-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9xplh\" (UID: \"3469de51-445a-4bda-9f65-d7e62b7ce452\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.598816 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.623739 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.627500 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-vq67l"] Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.645442 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gchht\" (UniqueName: \"kubernetes.io/projected/0f573c3d-716d-4f05-9308-6983d8c30570-kube-api-access-gchht\") pod \"console-operator-58897d9998-gdrf7\" (UID: \"0f573c3d-716d-4f05-9308-6983d8c30570\") " pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.655865 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2f74\" (UniqueName: \"kubernetes.io/projected/6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a-kube-api-access-p2f74\") pod \"apiserver-7bbb656c7d-r84kt\" (UID: \"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.662555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6zj6\" (UniqueName: \"kubernetes.io/projected/5669be4d-29d4-4cee-ad63-75f37e3727d2-kube-api-access-p6zj6\") pod \"console-f9d7485db-tmmk6\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.665750 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.683189 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.685931 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.690493 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhszh\" (UniqueName: \"kubernetes.io/projected/513ea4da-405c-4176-adbd-c8e5f68c631c-kube-api-access-lhszh\") pod \"machine-api-operator-5694c8668f-lx5mt\" (UID: \"513ea4da-405c-4176-adbd-c8e5f68c631c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.705488 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.723735 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.747860 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.766383 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.771145 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.783707 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.786412 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-k9nwr"] Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.786609 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.794575 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.804300 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.804297 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs"] Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.805391 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.807898 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dfxlz" event={"ID":"21854278-8a87-40ee-9209-509132febd54","Type":"ContainerStarted","Data":"c0645a2650aafc7370e13381648e93981dee0b1394f3372ae0b61308d0988b24"} Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.813994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" event={"ID":"a8985542-d2e9-4677-a112-3caacb378c86","Type":"ContainerStarted","Data":"ec18c37cb8a5dc46d7faeb90de336d6c5a1b943baa1d753f1211ad6bcb596e3a"} Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.824053 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vq67l" event={"ID":"e761a9a6-bbf9-4bc8-9c36-358522654b25","Type":"ContainerStarted","Data":"fdcf5deb3509848f153301b4670079210339a8fa9b523da7d3fd92718d2d9c1a"} Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.825390 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.825704 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.851307 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49"] Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.854495 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q2qbx"] Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.913742 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t"] Dec 08 20:07:02 crc kubenswrapper[4781]: I1208 20:07:02.935858 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-tmmk6"] Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.146288 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.152864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-trusted-ca\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.152969 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1243b0-6d24-4282-a3e7-c1c87296ca09-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.153146 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvp5w\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-kube-api-access-mvp5w\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.153223 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1243b0-6d24-4282-a3e7-c1c87296ca09-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.153296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.153355 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-certificates\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.153391 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-tls\") pod 
\"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.153419 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-bound-sa-token\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.153902 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:03.65387915 +0000 UTC m=+139.805162537 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:03 crc kubenswrapper[4781]: W1208 20:07:03.165458 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6529df15_5883_406f_a2e1_96fa351af72d.slice/crio-c4abbfb773fddb5bd50f85019281acb8c94959876a29ebd09d219e4998e6e5b1 WatchSource:0}: Error finding container c4abbfb773fddb5bd50f85019281acb8c94959876a29ebd09d219e4998e6e5b1: Status 404 returned error can't find the container with id c4abbfb773fddb5bd50f85019281acb8c94959876a29ebd09d219e4998e6e5b1 Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.166075 
4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt"] Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257340 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257727 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5580e232-3b42-4950-b361-070ae3378aea-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257760 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d315f8c3-da9d-46c0-ae31-8246ab341423-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" (UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257782 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-serving-cert\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257811 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-certificates\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257833 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d315f8c3-da9d-46c0-ae31-8246ab341423-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" (UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257849 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5580e232-3b42-4950-b361-070ae3378aea-config\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257873 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-tls\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.257895 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-config\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: 
\"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258139 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-config\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.258211 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:03.758173466 +0000 UTC m=+139.909456853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258258 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d315f8c3-da9d-46c0-ae31-8246ab341423-config\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" (UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258306 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-bound-sa-token\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258330 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wgqw\" (UniqueName: \"kubernetes.io/projected/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-kube-api-access-6wgqw\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h59bm\" (UniqueName: \"kubernetes.io/projected/8b3cbdf0-946b-454e-b384-0b65fb672971-kube-api-access-h59bm\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258387 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3cbdf0-946b-454e-b384-0b65fb672971-serving-cert\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258400 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-ca\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258421 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-trusted-ca\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258436 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-client-ca\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-client\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258487 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1243b0-6d24-4282-a3e7-c1c87296ca09-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258503 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5580e232-3b42-4950-b361-070ae3378aea-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvp5w\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-kube-api-access-mvp5w\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-service-ca\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.258589 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1243b0-6d24-4282-a3e7-c1c87296ca09-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.259796 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1243b0-6d24-4282-a3e7-c1c87296ca09-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 
20:07:03.260273 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-trusted-ca\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.260321 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-certificates\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.263472 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-tls\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.265052 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1243b0-6d24-4282-a3e7-c1c87296ca09-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.300632 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-bound-sa-token\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 
crc kubenswrapper[4781]: I1208 20:07:03.318647 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvp5w\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-kube-api-access-mvp5w\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.359766 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph9qw\" (UniqueName: \"kubernetes.io/projected/b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2-kube-api-access-ph9qw\") pod \"ingress-canary-fpzkk\" (UID: \"b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2\") " pod="openshift-ingress-canary/ingress-canary-fpzkk" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.359825 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wm52\" (UniqueName: \"kubernetes.io/projected/d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb-kube-api-access-4wm52\") pod \"multus-admission-controller-857f4d67dd-h4lz2\" (UID: \"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.359853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6vn\" (UniqueName: \"kubernetes.io/projected/bffb4880-1131-4b65-ad90-ce2a1f549e6e-kube-api-access-qz6vn\") pod \"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.359877 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cmzx\" (UniqueName: 
\"kubernetes.io/projected/6f836b62-7a30-4b02-93e9-d462f3c28d47-kube-api-access-4cmzx\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.359900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkcbt\" (UniqueName: \"kubernetes.io/projected/c59dd266-8743-4fa4-8641-450ea02d8dd4-kube-api-access-kkcbt\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360007 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-node-bootstrap-token\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-apiservice-cert\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360092 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d315f8c3-da9d-46c0-ae31-8246ab341423-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" (UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360118 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c59dd266-8743-4fa4-8641-450ea02d8dd4-service-ca-bundle\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bffb4880-1131-4b65-ad90-ce2a1f549e6e-srv-cert\") pod \"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360163 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4640a664-9c50-4360-bdc1-5c177822719a-config\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360216 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5580e232-3b42-4950-b361-070ae3378aea-config\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360240 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/309ab965-f0fc-44cd-9e5a-dcac0f258329-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xv7vb\" (UID: \"309ab965-f0fc-44cd-9e5a-dcac0f258329\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360299 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-config\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360384 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wgqw\" (UniqueName: \"kubernetes.io/projected/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-kube-api-access-6wgqw\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360430 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-tmpfs\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76a2ef7e-aad6-429b-9ce1-824672b5e201-config-volume\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " 
pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23d965c6-71e4-4594-a70e-aad1e2b24c3f-config-volume\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360610 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3cbdf0-946b-454e-b384-0b65fb672971-serving-cert\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360667 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a2ef7e-aad6-429b-9ce1-824672b5e201-metrics-tls\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360690 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfsgb\" (UniqueName: \"kubernetes.io/projected/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-kube-api-access-jfsgb\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: \"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360823 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46gsv\" (UniqueName: 
\"kubernetes.io/projected/5169f4a3-b20b-474d-af87-a5f29902edbf-kube-api-access-46gsv\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360872 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4640a664-9c50-4360-bdc1-5c177822719a-serving-cert\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: \"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360968 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7lss8\" (UID: 
\"959c7256-02a6-47bb-9b32-64387b359e95\") " pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.360990 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-registration-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5580e232-3b42-4950-b361-070ae3378aea-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmsr\" (UniqueName: \"kubernetes.io/projected/4640a664-9c50-4360-bdc1-5c177822719a-kube-api-access-9bmsr\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361065 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2-cert\") pod \"ingress-canary-fpzkk\" (UID: \"b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2\") " pod="openshift-ingress-canary/ingress-canary-fpzkk" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361088 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-images\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361112 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1e04db77-5adb-4052-b98c-e241b5f73973-signing-cabundle\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361134 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-mountpoint-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361194 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-proxy-tls\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361219 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-plugins-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361258 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-metrics-certs\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361332 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361385 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-serving-cert\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361413 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5169f4a3-b20b-474d-af87-a5f29902edbf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361440 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bffb4880-1131-4b65-ad90-ce2a1f549e6e-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361465 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: \"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361516 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm856\" (UniqueName: \"kubernetes.io/projected/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-kube-api-access-jm856\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361602 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcg74\" (UniqueName: \"kubernetes.io/projected/a9fba569-4fe5-44f1-905c-c0e844f64fca-kube-api-access-rcg74\") pod \"control-plane-machine-set-operator-78cbb6b69f-29wgh\" (UID: \"a9fba569-4fe5-44f1-905c-c0e844f64fca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361629 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ww98\" (UniqueName: \"kubernetes.io/projected/959c7256-02a6-47bb-9b32-64387b359e95-kube-api-access-4ww98\") pod \"marketplace-operator-79b997595-7lss8\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d315f8c3-da9d-46c0-ae31-8246ab341423-config\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" (UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361696 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5169f4a3-b20b-474d-af87-a5f29902edbf-srv-cert\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361717 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-certs\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361741 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-config\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361786 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2thh\" (UniqueName: 
\"kubernetes.io/projected/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-kube-api-access-m2thh\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361820 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-csi-data-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h59bm\" (UniqueName: \"kubernetes.io/projected/8b3cbdf0-946b-454e-b384-0b65fb672971-kube-api-access-h59bm\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361884 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-proxy-tls\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361943 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-default-certificate\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc 
kubenswrapper[4781]: I1208 20:07:03.361962 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvb5m\" (UniqueName: \"kubernetes.io/projected/309ab965-f0fc-44cd-9e5a-dcac0f258329-kube-api-access-hvb5m\") pod \"package-server-manager-789f6589d5-xv7vb\" (UID: \"309ab965-f0fc-44cd-9e5a-dcac0f258329\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.361981 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23d965c6-71e4-4594-a70e-aad1e2b24c3f-secret-volume\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362007 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-socket-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362023 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-ca\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362104 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-client-ca\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: 
\"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362121 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-webhook-cert\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/1e04db77-5adb-4052-b98c-e241b5f73973-kube-api-access-zsp7c\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362188 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1e04db77-5adb-4052-b98c-e241b5f73973-signing-key\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362222 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5580e232-3b42-4950-b361-070ae3378aea-config\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362228 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cth9c\" 
(UniqueName: \"kubernetes.io/projected/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-kube-api-access-cth9c\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.362358 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-config\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363229 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-client\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363270 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363300 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f836b62-7a30-4b02-93e9-d462f3c28d47-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" Dec 
08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363341 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnq9q\" (UniqueName: \"kubernetes.io/projected/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-kube-api-access-wnq9q\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363378 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-service-ca\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363400 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-stats-auth\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qssh\" (UniqueName: \"kubernetes.io/projected/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-kube-api-access-2qssh\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d315f8c3-da9d-46c0-ae31-8246ab341423-config\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" 
(UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363511 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h4lz2\" (UID: \"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363533 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7lss8\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363555 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v5qp\" (UniqueName: \"kubernetes.io/projected/23d965c6-71e4-4594-a70e-aad1e2b24c3f-kube-api-access-9v5qp\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363580 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5580e232-3b42-4950-b361-070ae3378aea-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363604 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d315f8c3-da9d-46c0-ae31-8246ab341423-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" (UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.363630 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f836b62-7a30-4b02-93e9-d462f3c28d47-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.364061 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:03.864043277 +0000 UTC m=+140.015326724 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.364613 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-client-ca\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.365706 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpjxn\" (UniqueName: \"kubernetes.io/projected/76a2ef7e-aad6-429b-9ce1-824672b5e201-kube-api-access-tpjxn\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.365734 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9fba569-4fe5-44f1-905c-c0e844f64fca-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-29wgh\" (UID: \"a9fba569-4fe5-44f1-905c-c0e844f64fca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.366047 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-service-ca\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.366689 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-ca\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.367171 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-config\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.371648 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8b3cbdf0-946b-454e-b384-0b65fb672971-etcd-client\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.372439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b3cbdf0-946b-454e-b384-0b65fb672971-serving-cert\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.373452 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5580e232-3b42-4950-b361-070ae3378aea-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.374572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-serving-cert\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.374705 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d315f8c3-da9d-46c0-ae31-8246ab341423-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" (UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.399562 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d315f8c3-da9d-46c0-ae31-8246ab341423-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2prnc\" (UID: \"d315f8c3-da9d-46c0-ae31-8246ab341423\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.417330 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.425376 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wgqw\" (UniqueName: \"kubernetes.io/projected/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-kube-api-access-6wgqw\") pod \"route-controller-manager-6576b87f9c-tqpvm\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.446062 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.447884 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h59bm\" (UniqueName: \"kubernetes.io/projected/8b3cbdf0-946b-454e-b384-0b65fb672971-kube-api-access-h59bm\") pod \"etcd-operator-b45778765-586sh\" (UID: \"8b3cbdf0-946b-454e-b384-0b65fb672971\") " pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.468868 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469076 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/309ab965-f0fc-44cd-9e5a-dcac0f258329-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xv7vb\" (UID: \"309ab965-f0fc-44cd-9e5a-dcac0f258329\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469128 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-tmpfs\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469157 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76a2ef7e-aad6-429b-9ce1-824672b5e201-config-volume\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23d965c6-71e4-4594-a70e-aad1e2b24c3f-config-volume\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469198 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a2ef7e-aad6-429b-9ce1-824672b5e201-metrics-tls\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469218 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfsgb\" (UniqueName: \"kubernetes.io/projected/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-kube-api-access-jfsgb\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: 
\"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469248 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46gsv\" (UniqueName: \"kubernetes.io/projected/5169f4a3-b20b-474d-af87-a5f29902edbf-kube-api-access-46gsv\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4640a664-9c50-4360-bdc1-5c177822719a-serving-cert\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469287 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7lss8\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469327 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: \"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469353 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmsr\" (UniqueName: \"kubernetes.io/projected/4640a664-9c50-4360-bdc1-5c177822719a-kube-api-access-9bmsr\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469373 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-registration-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-images\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469416 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2-cert\") pod \"ingress-canary-fpzkk\" (UID: \"b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2\") " pod="openshift-ingress-canary/ingress-canary-fpzkk" 
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469436 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1e04db77-5adb-4052-b98c-e241b5f73973-signing-cabundle\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469455 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-mountpoint-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469476 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-proxy-tls\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-plugins-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469513 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-metrics-certs\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " 
pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469553 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5169f4a3-b20b-474d-af87-a5f29902edbf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bffb4880-1131-4b65-ad90-ce2a1f549e6e-profile-collector-cert\") pod \"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469598 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: \"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469621 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm856\" (UniqueName: \"kubernetes.io/projected/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-kube-api-access-jm856\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469652 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcg74\" (UniqueName: 
\"kubernetes.io/projected/a9fba569-4fe5-44f1-905c-c0e844f64fca-kube-api-access-rcg74\") pod \"control-plane-machine-set-operator-78cbb6b69f-29wgh\" (UID: \"a9fba569-4fe5-44f1-905c-c0e844f64fca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469674 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ww98\" (UniqueName: \"kubernetes.io/projected/959c7256-02a6-47bb-9b32-64387b359e95-kube-api-access-4ww98\") pod \"marketplace-operator-79b997595-7lss8\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469701 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5169f4a3-b20b-474d-af87-a5f29902edbf-srv-cert\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469722 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-certs\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469746 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2thh\" (UniqueName: \"kubernetes.io/projected/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-kube-api-access-m2thh\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:03 crc kubenswrapper[4781]: 
I1208 20:07:03.469765 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-csi-data-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469789 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-proxy-tls\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469814 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-default-certificate\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvb5m\" (UniqueName: \"kubernetes.io/projected/309ab965-f0fc-44cd-9e5a-dcac0f258329-kube-api-access-hvb5m\") pod \"package-server-manager-789f6589d5-xv7vb\" (UID: \"309ab965-f0fc-44cd-9e5a-dcac0f258329\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469857 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-socket-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " 
pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469878 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23d965c6-71e4-4594-a70e-aad1e2b24c3f-secret-volume\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469912 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-webhook-cert\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469954 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/1e04db77-5adb-4052-b98c-e241b5f73973-kube-api-access-zsp7c\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.469978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1e04db77-5adb-4052-b98c-e241b5f73973-signing-key\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cth9c\" (UniqueName: \"kubernetes.io/projected/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-kube-api-access-cth9c\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: 
\"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470062 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f836b62-7a30-4b02-93e9-d462f3c28d47-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470088 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnq9q\" (UniqueName: \"kubernetes.io/projected/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-kube-api-access-wnq9q\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470113 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-stats-auth\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470138 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2qssh\" (UniqueName: \"kubernetes.io/projected/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-kube-api-access-2qssh\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470160 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f836b62-7a30-4b02-93e9-d462f3c28d47-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470182 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h4lz2\" (UID: \"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470208 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7lss8\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470230 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v5qp\" (UniqueName: \"kubernetes.io/projected/23d965c6-71e4-4594-a70e-aad1e2b24c3f-kube-api-access-9v5qp\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9fba569-4fe5-44f1-905c-c0e844f64fca-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-29wgh\" (UID: \"a9fba569-4fe5-44f1-905c-c0e844f64fca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpjxn\" (UniqueName: \"kubernetes.io/projected/76a2ef7e-aad6-429b-9ce1-824672b5e201-kube-api-access-tpjxn\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470315 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph9qw\" (UniqueName: \"kubernetes.io/projected/b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2-kube-api-access-ph9qw\") pod \"ingress-canary-fpzkk\" (UID: \"b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2\") " pod="openshift-ingress-canary/ingress-canary-fpzkk" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470341 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cmzx\" (UniqueName: \"kubernetes.io/projected/6f836b62-7a30-4b02-93e9-d462f3c28d47-kube-api-access-4cmzx\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4wm52\" (UniqueName: \"kubernetes.io/projected/d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb-kube-api-access-4wm52\") pod \"multus-admission-controller-857f4d67dd-h4lz2\" (UID: \"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470388 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6vn\" (UniqueName: \"kubernetes.io/projected/bffb4880-1131-4b65-ad90-ce2a1f549e6e-kube-api-access-qz6vn\") pod \"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470413 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkcbt\" (UniqueName: \"kubernetes.io/projected/c59dd266-8743-4fa4-8641-450ea02d8dd4-kube-api-access-kkcbt\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470437 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-node-bootstrap-token\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-apiservice-cert\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 
20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470484 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c59dd266-8743-4fa4-8641-450ea02d8dd4-service-ca-bundle\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470505 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bffb4880-1131-4b65-ad90-ce2a1f549e6e-srv-cert\") pod \"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470527 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4640a664-9c50-4360-bdc1-5c177822719a-config\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.470549 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-images\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.470850 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-08 20:07:03.970835294 +0000 UTC m=+140.122118671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.471672 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.472725 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7lss8\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.473030 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-registration-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.473384 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-tmpfs\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.473613 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23d965c6-71e4-4594-a70e-aad1e2b24c3f-config-volume\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.474076 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76a2ef7e-aad6-429b-9ce1-824672b5e201-config-volume\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.474192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4640a664-9c50-4360-bdc1-5c177822719a-config\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.474681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1e04db77-5adb-4052-b98c-e241b5f73973-signing-cabundle\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.474740 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-mountpoint-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.478668 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.479526 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f836b62-7a30-4b02-93e9-d462f3c28d47-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.481483 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-plugins-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.481960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: \"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 
20:07:03.483318 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-csi-data-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.484362 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-proxy-tls\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.485585 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-socket-dir\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.485860 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c59dd266-8743-4fa4-8641-450ea02d8dd4-service-ca-bundle\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.486199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/76a2ef7e-aad6-429b-9ce1-824672b5e201-metrics-tls\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " pod="openshift-dns/dns-default-nxdff"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.486976 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f836b62-7a30-4b02-93e9-d462f3c28d47-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.487967 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bffb4880-1131-4b65-ad90-ce2a1f549e6e-profile-collector-cert\") pod \"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.488020 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7lss8\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " pod="openshift-marketplace/marketplace-operator-79b997595-7lss8"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.488073 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-certs\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.491515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-proxy-tls\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.491515 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2-cert\") pod \"ingress-canary-fpzkk\" (UID: \"b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2\") " pod="openshift-ingress-canary/ingress-canary-fpzkk"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.492769 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-h4lz2\" (UID: \"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.492768 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/309ab965-f0fc-44cd-9e5a-dcac0f258329-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xv7vb\" (UID: \"309ab965-f0fc-44cd-9e5a-dcac0f258329\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.496384 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4640a664-9c50-4360-bdc1-5c177822719a-serving-cert\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.496562 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-metrics-certs\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.496631 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: \"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.496774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-node-bootstrap-token\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.499506 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5169f4a3-b20b-474d-af87-a5f29902edbf-srv-cert\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.501306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-webhook-cert\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.501437 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-default-certificate\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.501461 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1e04db77-5adb-4052-b98c-e241b5f73973-signing-key\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.501822 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23d965c6-71e4-4594-a70e-aad1e2b24c3f-secret-volume\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.501838 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-apiservice-cert\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.501847 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bffb4880-1131-4b65-ad90-ce2a1f549e6e-srv-cert\") pod \"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.502365 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5169f4a3-b20b-474d-af87-a5f29902edbf-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.503479 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9fba569-4fe5-44f1-905c-c0e844f64fca-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-29wgh\" (UID: \"a9fba569-4fe5-44f1-905c-c0e844f64fca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.504293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c59dd266-8743-4fa4-8641-450ea02d8dd4-stats-auth\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.504472 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5580e232-3b42-4950-b361-070ae3378aea-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dp272\" (UID: \"5580e232-3b42-4950-b361-070ae3378aea\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.520487 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfsgb\" (UniqueName: \"kubernetes.io/projected/57b7029d-c823-4d00-9b7e-e606ba5d8bf6-kube-api-access-jfsgb\") pod \"kube-storage-version-migrator-operator-b67b599dd-bhv76\" (UID: \"57b7029d-c823-4d00-9b7e-e606ba5d8bf6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.556143 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46gsv\" (UniqueName: \"kubernetes.io/projected/5169f4a3-b20b-474d-af87-a5f29902edbf-kube-api-access-46gsv\") pod \"olm-operator-6b444d44fb-vjwv2\" (UID: \"5169f4a3-b20b-474d-af87-a5f29902edbf\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.571815 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd"
Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.572425 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:04.072407524 +0000 UTC m=+140.223690901 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.580824 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmsr\" (UniqueName: \"kubernetes.io/projected/4640a664-9c50-4360-bdc1-5c177822719a-kube-api-access-9bmsr\") pod \"service-ca-operator-777779d784-c7h7f\" (UID: \"4640a664-9c50-4360-bdc1-5c177822719a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.583737 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsp7c\" (UniqueName: \"kubernetes.io/projected/1e04db77-5adb-4052-b98c-e241b5f73973-kube-api-access-zsp7c\") pod \"service-ca-9c57cc56f-6vd86\" (UID: \"1e04db77-5adb-4052-b98c-e241b5f73973\") " pod="openshift-service-ca/service-ca-9c57cc56f-6vd86"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.601483 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v5qp\" (UniqueName: \"kubernetes.io/projected/23d965c6-71e4-4594-a70e-aad1e2b24c3f-kube-api-access-9v5qp\") pod \"collect-profiles-29420400-xh5nm\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.616654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cth9c\" (UniqueName: \"kubernetes.io/projected/7f2241e4-2e96-4d40-9bbf-aaae0e6948cc-kube-api-access-cth9c\") pod \"packageserver-d55dfcdfc-b6gxs\" (UID: \"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.638494 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qssh\" (UniqueName: \"kubernetes.io/projected/98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba-kube-api-access-2qssh\") pod \"csi-hostpathplugin-dx6bv\" (UID: \"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba\") " pod="hostpath-provisioner/csi-hostpathplugin-dx6bv"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.660900 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6vn\" (UniqueName: \"kubernetes.io/projected/bffb4880-1131-4b65-ad90-ce2a1f549e6e-kube-api-access-qz6vn\") pod \"catalog-operator-68c6474976-nh4xs\" (UID: \"bffb4880-1131-4b65-ad90-ce2a1f549e6e\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.675742 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-586sh"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.676496 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.676845 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:04.176828103 +0000 UTC m=+140.328111480 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.694564 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpjxn\" (UniqueName: \"kubernetes.io/projected/76a2ef7e-aad6-429b-9ce1-824672b5e201-kube-api-access-tpjxn\") pod \"dns-default-nxdff\" (UID: \"76a2ef7e-aad6-429b-9ce1-824672b5e201\") " pod="openshift-dns/dns-default-nxdff"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.707412 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.714509 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph9qw\" (UniqueName: \"kubernetes.io/projected/b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2-kube-api-access-ph9qw\") pod \"ingress-canary-fpzkk\" (UID: \"b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2\") " pod="openshift-ingress-canary/ingress-canary-fpzkk"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.749946 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cmzx\" (UniqueName: \"kubernetes.io/projected/6f836b62-7a30-4b02-93e9-d462f3c28d47-kube-api-access-4cmzx\") pod \"openshift-controller-manager-operator-756b6f6bc6-gphmx\" (UID: \"6f836b62-7a30-4b02-93e9-d462f3c28d47\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.752684 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz"]
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.764095 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wm52\" (UniqueName: \"kubernetes.io/projected/d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb-kube-api-access-4wm52\") pod \"multus-admission-controller-857f4d67dd-h4lz2\" (UID: \"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.765495 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.766334 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp"]
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.770906 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkcbt\" (UniqueName: \"kubernetes.io/projected/c59dd266-8743-4fa4-8641-450ea02d8dd4-kube-api-access-kkcbt\") pod \"router-default-5444994796-k7smf\" (UID: \"c59dd266-8743-4fa4-8641-450ea02d8dd4\") " pod="openshift-ingress/router-default-5444994796-k7smf"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.777349 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd"
Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.777751 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:04.277740083 +0000 UTC m=+140.429023450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.780815 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.781818 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lx5mt"]
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.786797 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2thh\" (UniqueName: \"kubernetes.io/projected/cf3ca37f-4f9e-444a-9f18-ce4d1419a82c-kube-api-access-m2thh\") pod \"machine-config-controller-84d6567774-bgrbp\" (UID: \"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.803283 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.809117 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hjncp"]
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.809583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnq9q\" (UniqueName: \"kubernetes.io/projected/42eb9160-8b8c-462d-8d3f-1b8d0ec6810d-kube-api-access-wnq9q\") pod \"machine-config-operator-74547568cd-h6zf7\" (UID: \"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.809885 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.812778 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc"]
Dec 08 20:07:03 crc kubenswrapper[4781]: W1208 20:07:03.815737 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ec45fa5_9d85_4aea_b458_79c149b76828.slice/crio-6135cf97917071fd2265761544cf37deb12428225b8750e39b3b9c49148c407c WatchSource:0}: Error finding container 6135cf97917071fd2265761544cf37deb12428225b8750e39b3b9c49148c407c: Status 404 returned error can't find the container with id 6135cf97917071fd2265761544cf37deb12428225b8750e39b3b9c49148c407c
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.820054 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.828366 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.831636 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm856\" (UniqueName: \"kubernetes.io/projected/d92d2f16-50ee-482a-8d85-d0ac080b1a2a-kube-api-access-jm856\") pod \"machine-config-server-5cnmw\" (UID: \"d92d2f16-50ee-482a-8d85-d0ac080b1a2a\") " pod="openshift-machine-config-operator/machine-config-server-5cnmw"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.834379 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.840420 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" event={"ID":"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd","Type":"ContainerStarted","Data":"b1bdf03edef18eca45513402bd439d2af501f162cd9762aab109d0a5f16f8c4f"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.840459 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" event={"ID":"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd","Type":"ContainerStarted","Data":"2373b85b6ff90e32f9ef1055fbe153d43443055d441191c1185158a18f59e160"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.843248 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcg74\" (UniqueName: \"kubernetes.io/projected/a9fba569-4fe5-44f1-905c-c0e844f64fca-kube-api-access-rcg74\") pod \"control-plane-machine-set-operator-78cbb6b69f-29wgh\" (UID: \"a9fba569-4fe5-44f1-905c-c0e844f64fca\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.846814 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-gdrf7"]
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.848893 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tmmk6" event={"ID":"5669be4d-29d4-4cee-ad63-75f37e3727d2","Type":"ContainerStarted","Data":"d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.848957 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tmmk6" event={"ID":"5669be4d-29d4-4cee-ad63-75f37e3727d2","Type":"ContainerStarted","Data":"7c55f3a079d194b62af3979b302cba81c43c52fab5c2ee5fd903e38be37609de"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.849832 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6vd86"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.851154 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" event={"ID":"6529df15-5883-406f-a2e1-96fa351af72d","Type":"ContainerStarted","Data":"c4abbfb773fddb5bd50f85019281acb8c94959876a29ebd09d219e4998e6e5b1"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.855167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" event={"ID":"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5","Type":"ContainerStarted","Data":"3325fd31bf04e8bf409fa97ad9c3a5e384895e8969d6b7abfe3c166e423629a0"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.856129 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.858267 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ww98\" (UniqueName: \"kubernetes.io/projected/959c7256-02a6-47bb-9b32-64387b359e95-kube-api-access-4ww98\") pod \"marketplace-operator-79b997595-7lss8\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " pod="openshift-marketplace/marketplace-operator-79b997595-7lss8"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.862982 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" event={"ID":"152c0bd2-ed47-4f77-851d-562166d3bc1f","Type":"ContainerStarted","Data":"bb2e5247ff587cf064884d33aee9c25021bd26ddf5bcc69fd30ea4584bff4fe3"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.863022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" event={"ID":"152c0bd2-ed47-4f77-851d-562166d3bc1f","Type":"ContainerStarted","Data":"b37436c4e6d73fc866830fea4c33f1d6bf826d22c8ee3ca964fd33bc1aaf85e2"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.865047 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" event={"ID":"513ea4da-405c-4176-adbd-c8e5f68c631c","Type":"ContainerStarted","Data":"51248e4ce0de579927c5993aea39eeebb92b9bc451fe971606898d91fb327e16"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.866688 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5cnmw"
Dec 08 20:07:03 crc kubenswrapper[4781]: W1208 20:07:03.868055 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db57361_77ab_43d3_acc5_d4de29c8f13e.slice/crio-946953d16bfbd9399cf82ee863a5c112b7656c5bbd59dd10bad50a1d5a58f78c WatchSource:0}: Error finding container 946953d16bfbd9399cf82ee863a5c112b7656c5bbd59dd10bad50a1d5a58f78c: Status 404 returned error can't find the container with id 946953d16bfbd9399cf82ee863a5c112b7656c5bbd59dd10bad50a1d5a58f78c
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.874399 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" event={"ID":"a8985542-d2e9-4677-a112-3caacb378c86","Type":"ContainerStarted","Data":"c97a0cbe4503bf98955483d5d7ae67301846bec4cd61a32b0b6e1522bce9ceb4"}
Dec 08 20:07:03 crc kubenswrapper[4781]: W1208 20:07:03.874510 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd315f8c3_da9d_46c0_ae31_8246ab341423.slice/crio-f6d47cc2dd290691efbb6b20201a5908afa2dcdeaffc341efe47323a07ef565b WatchSource:0}: Error finding container f6d47cc2dd290691efbb6b20201a5908afa2dcdeaffc341efe47323a07ef565b: Status 404 returned error can't find the container with id f6d47cc2dd290691efbb6b20201a5908afa2dcdeaffc341efe47323a07ef565b
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.878544 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.879056 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:04.379028765 +0000 UTC m=+140.530312152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.882337 4781 generic.go:334] "Generic (PLEG): container finished" podID="e761a9a6-bbf9-4bc8-9c36-358522654b25" containerID="be0ce9e03dcf6083718dd9e46f7680e7cf49654f5eb62d73273628dc61e1e2bd" exitCode=0
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.882394 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vq67l" event={"ID":"e761a9a6-bbf9-4bc8-9c36-358522654b25","Type":"ContainerDied","Data":"be0ce9e03dcf6083718dd9e46f7680e7cf49654f5eb62d73273628dc61e1e2bd"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.884816 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm"]
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.887144 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" event={"ID":"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2","Type":"ContainerStarted","Data":"71d27c8338f07359e5f8732da2a0a77bea8a3c5f07e38d523c16206a90cffe7e"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.887192 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" event={"ID":"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2","Type":"ContainerStarted","Data":"8e0ca1b099f388699e2c6924523c9d1f79572cf689e1c3441dfc8984769fb772"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.887455 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dx6bv"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.894521 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nxdff"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.895267 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" event={"ID":"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a","Type":"ContainerStarted","Data":"f679ece13ef32d81cd70c38a040825692f1bcc52c055df38460eceeb237a6e81"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.899561 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fpzkk"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.902616 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dfxlz" event={"ID":"21854278-8a87-40ee-9209-509132febd54","Type":"ContainerStarted","Data":"e8d17a01f469bfa4ca0cbfc9d805f3f045a61c2f542c55cd86d0b5a7b125d781"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.903230 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dfxlz"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.904126 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-dfxlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.904174 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dfxlz" podUID="21854278-8a87-40ee-9209-509132febd54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.906816 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" event={"ID":"3ec45fa5-9d85-4aea-b458-79c149b76828","Type":"ContainerStarted","Data":"6135cf97917071fd2265761544cf37deb12428225b8750e39b3b9c49148c407c"}
Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.911048 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvb5m\" (UniqueName: \"kubernetes.io/projected/309ab965-f0fc-44cd-9e5a-dcac0f258329-kube-api-access-hvb5m\") pod \"package-server-manager-789f6589d5-xv7vb\" (UID: \"309ab965-f0fc-44cd-9e5a-dcac0f258329\") "
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.912249 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" event={"ID":"80f07ecd-4175-4ed0-b11e-507b0b7a783f","Type":"ContainerStarted","Data":"e316a42472faf1485b45a10a1ba5ece0180d6fff9a2eb2627fefe7481de6290e"} Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.912284 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" event={"ID":"80f07ecd-4175-4ed0-b11e-507b0b7a783f","Type":"ContainerStarted","Data":"518175c4ba273ca44b0f445d8bc91648dc8ebda3a621097e790595386cbea4a6"} Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.927716 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh"] Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.927769 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-546nt"] Dec 08 20:07:03 crc kubenswrapper[4781]: W1208 20:07:03.948351 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb05dbba1_cec3_4a66_aa31_7362bb50ed2f.slice/crio-bcb503b909f9e11263680a63f238fa462d4e9bae5f8d47318680399a45ea463f WatchSource:0}: Error finding container bcb503b909f9e11263680a63f238fa462d4e9bae5f8d47318680399a45ea463f: Status 404 returned error can't find the container with id bcb503b909f9e11263680a63f238fa462d4e9bae5f8d47318680399a45ea463f Dec 08 20:07:03 crc kubenswrapper[4781]: I1208 20:07:03.982055 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:03 crc kubenswrapper[4781]: E1208 20:07:03.982604 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:04.482586811 +0000 UTC m=+140.633870188 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:03 crc kubenswrapper[4781]: W1208 20:07:03.995006 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3469de51_445a_4bda_9f65_d7e62b7ce452.slice/crio-0143c8434e7f7e3b8424d47c837fbf1350a75d9e2b2c043626526fd99eeea729 WatchSource:0}: Error finding container 0143c8434e7f7e3b8424d47c837fbf1350a75d9e2b2c043626526fd99eeea729: Status 404 returned error can't find the container with id 0143c8434e7f7e3b8424d47c837fbf1350a75d9e2b2c043626526fd99eeea729 Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.050428 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.056827 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.074729 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.082693 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.083173 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:04.583140352 +0000 UTC m=+140.734423729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.096542 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.111089 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.142753 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.193193 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-586sh"] Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.194346 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.194579 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:04.694567899 +0000 UTC m=+140.845851276 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.234238 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx"] Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.268780 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272"] Dec 08 20:07:04 crc kubenswrapper[4781]: W1208 20:07:04.270628 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd92d2f16_50ee_482a_8d85_d0ac080b1a2a.slice/crio-d270b2ad9889513f626ca55b09e94ad069bce146dde29dd837c9ae0033819957 WatchSource:0}: Error finding container d270b2ad9889513f626ca55b09e94ad069bce146dde29dd837c9ae0033819957: Status 404 returned error can't find the container with id d270b2ad9889513f626ca55b09e94ad069bce146dde29dd837c9ae0033819957 Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.296372 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.296767 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:04.796748076 +0000 UTC m=+140.948031453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: W1208 20:07:04.377383 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f836b62_7a30_4b02_93e9_d462f3c28d47.slice/crio-81f99ae26e4289c862ab1703c1cc4225c27cccabfebbd89e7dab0c59c92c84f3 WatchSource:0}: Error finding container 81f99ae26e4289c862ab1703c1cc4225c27cccabfebbd89e7dab0c59c92c84f3: Status 404 returned error can't find the container with id 81f99ae26e4289c862ab1703c1cc4225c27cccabfebbd89e7dab0c59c92c84f3 Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.398656 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.399065 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-08 20:07:04.899051066 +0000 UTC m=+141.050334443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.409952 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nxdff"] Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.446820 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-h4lz2"] Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.506482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.506995 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.006976074 +0000 UTC m=+141.158259451 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.616565 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.617054 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.117039863 +0000 UTC m=+141.268323240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: W1208 20:07:04.642659 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76a2ef7e_aad6_429b_9ce1_824672b5e201.slice/crio-ba811b4a7b11508baab8c5158aa8d71794413b87dc23b5dece8dba8e0ef1ed21 WatchSource:0}: Error finding container ba811b4a7b11508baab8c5158aa8d71794413b87dc23b5dece8dba8e0ef1ed21: Status 404 returned error can't find the container with id ba811b4a7b11508baab8c5158aa8d71794413b87dc23b5dece8dba8e0ef1ed21 Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.717121 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.717836 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.21781139 +0000 UTC m=+141.369094767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.773667 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs"] Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.809993 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76"] Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.818460 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.818796 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.318783083 +0000 UTC m=+141.470066470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.889111 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6vd86"] Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.928467 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:04 crc kubenswrapper[4781]: E1208 20:07:04.928829 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.42881146 +0000 UTC m=+141.580094837 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:04 crc kubenswrapper[4781]: I1208 20:07:04.967851 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" event={"ID":"a8985542-d2e9-4677-a112-3caacb378c86","Type":"ContainerStarted","Data":"80b0b1065cace1a26d74322444d2a9153e789be18fcd87753920c9ab8e400c67"} Dec 08 20:07:05 crc kubenswrapper[4781]: W1208 20:07:05.011132 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e04db77_5adb_4052_b98c_e241b5f73973.slice/crio-b94dbcc8add40ef1851a83f7c16bdec59882d97a086c49c2d5277e164474078b WatchSource:0}: Error finding container b94dbcc8add40ef1851a83f7c16bdec59882d97a086c49c2d5277e164474078b: Status 404 returned error can't find the container with id b94dbcc8add40ef1851a83f7c16bdec59882d97a086c49c2d5277e164474078b Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.030586 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dfxlz" podStartSLOduration=121.030568435 podStartE2EDuration="2m1.030568435s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:05.029382252 +0000 UTC m=+141.180665629" watchObservedRunningTime="2025-12-08 20:07:05.030568435 +0000 UTC m=+141.181851812" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.030712 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.032516 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.53250163 +0000 UTC m=+141.683785067 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.040556 4781 generic.go:334] "Generic (PLEG): container finished" podID="6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a" containerID="38b69028fac240aeb9d40019bc956f05f1bf0e4428b944ef198c06ccc05aafc3" exitCode=0 Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.040659 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" event={"ID":"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a","Type":"ContainerDied","Data":"38b69028fac240aeb9d40019bc956f05f1bf0e4428b944ef198c06ccc05aafc3"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.076429 4781 generic.go:334] "Generic (PLEG): container finished" podID="3ec45fa5-9d85-4aea-b458-79c149b76828" 
containerID="1441e4dbac4fa63d60fb909a518fad83ffe64ee046af5f6ed91b7038afe46c88" exitCode=0 Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.076518 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" event={"ID":"3ec45fa5-9d85-4aea-b458-79c149b76828","Type":"ContainerDied","Data":"1441e4dbac4fa63d60fb909a518fad83ffe64ee046af5f6ed91b7038afe46c88"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.081000 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" event={"ID":"8d53e0cc-045b-4cb2-a5d3-be89b73f98e5","Type":"ContainerStarted","Data":"f72fde06832fc2dd8f07b11ebc9c83a4d8d08e248a9ec307bc067bba747f2c3f"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.087827 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" event={"ID":"8b3cbdf0-946b-454e-b384-0b65fb672971","Type":"ContainerStarted","Data":"cefe5a13a13ab880006094ae9fa8b4c1a71920ebaf934c8eaec44ef7ca7dcbb6"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.099690 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-tmmk6" podStartSLOduration=121.099669581 podStartE2EDuration="2m1.099669581s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:05.099247629 +0000 UTC m=+141.250531016" watchObservedRunningTime="2025-12-08 20:07:05.099669581 +0000 UTC m=+141.250952958" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.115851 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" 
event={"ID":"6f836b62-7a30-4b02-93e9-d462f3c28d47","Type":"ContainerStarted","Data":"81f99ae26e4289c862ab1703c1cc4225c27cccabfebbd89e7dab0c59c92c84f3"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.121293 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" event={"ID":"3469de51-445a-4bda-9f65-d7e62b7ce452","Type":"ContainerStarted","Data":"0143c8434e7f7e3b8424d47c837fbf1350a75d9e2b2c043626526fd99eeea729"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.125100 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" event={"ID":"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb","Type":"ContainerStarted","Data":"9f07cc6bc506aa96e5fb8f597c1ccc53aa00a2d3af6bcb597e1a59e322698447"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.131256 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.135347 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.635319234 +0000 UTC m=+141.786602671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.150652 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-q2qbx" podStartSLOduration=121.150632746 podStartE2EDuration="2m1.150632746s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:05.14653918 +0000 UTC m=+141.297822567" watchObservedRunningTime="2025-12-08 20:07:05.150632746 +0000 UTC m=+141.301916123" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.181982 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nxdff" event={"ID":"76a2ef7e-aad6-429b-9ce1-824672b5e201","Type":"ContainerStarted","Data":"ba811b4a7b11508baab8c5158aa8d71794413b87dc23b5dece8dba8e0ef1ed21"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.202763 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nvb2t" podStartSLOduration=121.202740413 podStartE2EDuration="2m1.202740413s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:05.196013973 +0000 UTC m=+141.347297350" watchObservedRunningTime="2025-12-08 20:07:05.202740413 +0000 UTC m=+141.354023790" Dec 08 20:07:05 crc 
kubenswrapper[4781]: I1208 20:07:05.203817 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" event={"ID":"b05dbba1-cec3-4a66-aa31-7362bb50ed2f","Type":"ContainerStarted","Data":"412ce4eda377a47b0618b24a4eec4f3295882c420959d4b3e943ca6245b1b99b"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.203870 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" event={"ID":"b05dbba1-cec3-4a66-aa31-7362bb50ed2f","Type":"ContainerStarted","Data":"bcb503b909f9e11263680a63f238fa462d4e9bae5f8d47318680399a45ea463f"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.204225 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.233233 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.235443 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kf259" podStartSLOduration=122.235423103 podStartE2EDuration="2m2.235423103s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:05.232182172 +0000 UTC m=+141.383465549" watchObservedRunningTime="2025-12-08 20:07:05.235423103 +0000 UTC m=+141.386706490" Dec 08 20:07:05 crc kubenswrapper[4781]: 
E1208 20:07:05.236122 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.736102302 +0000 UTC m=+141.887385679 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.246349 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" event={"ID":"b251f9a7-b3a4-4c7e-be56-b4e5647cc6fd","Type":"ContainerStarted","Data":"017974d2aef16d0877b3de42f00c6ebc0754cdbcf261a8f22925635de65fbf0d"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.258178 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" event={"ID":"fecbaff0-859c-4fc4-b2fd-56fbc53c192b","Type":"ContainerStarted","Data":"26a5659df30272d2c41a1c1b3b965c71ae1719adab33d37ed4385187acbe8047"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.265796 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" event={"ID":"b0b17f17-89ff-4f30-a316-6b6a17d8b7a2","Type":"ContainerStarted","Data":"341bb0cf098e64ebcb04960695deab2ef35142b3843fa5a1f72d30785b68afb8"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.276289 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5cnmw" 
event={"ID":"d92d2f16-50ee-482a-8d85-d0ac080b1a2a","Type":"ContainerStarted","Data":"d270b2ad9889513f626ca55b09e94ad069bce146dde29dd837c9ae0033819957"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.277481 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" event={"ID":"513ea4da-405c-4176-adbd-c8e5f68c631c","Type":"ContainerStarted","Data":"ef6902f2a5f856d5076b5371816e0c08ddbcd520eeb3f5294dab8accdcc022dc"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.282567 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" event={"ID":"5580e232-3b42-4950-b361-070ae3378aea","Type":"ContainerStarted","Data":"159c75dc93aef12194953582f3579df59ec05c1b2a40e676ba08e3fece88d084"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.283814 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs"] Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.303495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gdrf7" event={"ID":"0f573c3d-716d-4f05-9308-6983d8c30570","Type":"ContainerStarted","Data":"e272445bcf4ee68763d807469385667fc895e02627689e1d9b48a717a8d9db47"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.304134 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.314227 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" event={"ID":"a0818ea3-b629-47a3-8edb-d77e60a23068","Type":"ContainerStarted","Data":"22862a9ab9d55ac6b7d37ff8b6605b2384b88699d5396ed78868badce08394b8"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.314348 4781 patch_prober.go:28] interesting 
pod/console-operator-58897d9998-gdrf7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.314379 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-gdrf7" podUID="0f573c3d-716d-4f05-9308-6983d8c30570" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.328531 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" event={"ID":"8db57361-77ab-43d3-acc5-d4de29c8f13e","Type":"ContainerStarted","Data":"946953d16bfbd9399cf82ee863a5c112b7656c5bbd59dd10bad50a1d5a58f78c"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.334293 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.334591 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.834574134 +0000 UTC m=+141.985857511 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.342065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" event={"ID":"d315f8c3-da9d-46c0-ae31-8246ab341423","Type":"ContainerStarted","Data":"f6d47cc2dd290691efbb6b20201a5908afa2dcdeaffc341efe47323a07ef565b"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.372726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" event={"ID":"6529df15-5883-406f-a2e1-96fa351af72d","Type":"ContainerStarted","Data":"872d4a14189f2994e2f17a187048577d053a155b228fce304e627f2c48b8ab04"} Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.377694 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-dfxlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.377761 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dfxlz" podUID="21854278-8a87-40ee-9209-509132febd54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.438424 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.445829 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:05.945810886 +0000 UTC m=+142.097094263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.555109 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.557385 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.057361827 +0000 UTC m=+142.208645234 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: W1208 20:07:05.571307 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f2241e4_2e96_4d40_9bbf_aaae0e6948cc.slice/crio-6fa9f914b4a3ad45ce5495031fb71620a2a8d02fb595fedec71b3fd7b388273c WatchSource:0}: Error finding container 6fa9f914b4a3ad45ce5495031fb71620a2a8d02fb595fedec71b3fd7b388273c: Status 404 returned error can't find the container with id 6fa9f914b4a3ad45ce5495031fb71620a2a8d02fb595fedec71b3fd7b388273c Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.667720 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.668216 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.168202867 +0000 UTC m=+142.319486244 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.669516 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fpzkk"] Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.686534 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2"] Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.720645 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" podStartSLOduration=121.720628243 podStartE2EDuration="2m1.720628243s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:05.72014911 +0000 UTC m=+141.871432487" watchObservedRunningTime="2025-12-08 20:07:05.720628243 +0000 UTC m=+141.871911620" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.747668 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f"] Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.769495 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.769907 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.26988822 +0000 UTC m=+142.421171597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.871706 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.872097 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.372081137 +0000 UTC m=+142.523364514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.895273 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pbxvs" podStartSLOduration=121.895246619 podStartE2EDuration="2m1.895246619s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:05.888210561 +0000 UTC m=+142.039493938" watchObservedRunningTime="2025-12-08 20:07:05.895246619 +0000 UTC m=+142.046529996" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.940632 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh"] Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.961523 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp"] Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.972936 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:05 crc kubenswrapper[4781]: E1208 20:07:05.973387 4781 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.473373159 +0000 UTC m=+142.624656536 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.976970 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qtz49" podStartSLOduration=121.97695415 podStartE2EDuration="2m1.97695415s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:05.976442345 +0000 UTC m=+142.127725722" watchObservedRunningTime="2025-12-08 20:07:05.97695415 +0000 UTC m=+142.128237527" Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.991571 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dx6bv"] Dec 08 20:07:05 crc kubenswrapper[4781]: I1208 20:07:05.995864 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"] Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.018230 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb"] Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.026184 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mdwc7" podStartSLOduration=122.026090013 podStartE2EDuration="2m2.026090013s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.012355977 +0000 UTC m=+142.163639354" watchObservedRunningTime="2025-12-08 20:07:06.026090013 +0000 UTC m=+142.177373390" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.030792 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7lss8"] Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.075406 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.075772 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.575757761 +0000 UTC m=+142.727041138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.122381 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-gdrf7" podStartSLOduration=122.122361114 podStartE2EDuration="2m2.122361114s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.038862763 +0000 UTC m=+142.190146150" watchObservedRunningTime="2025-12-08 20:07:06.122361114 +0000 UTC m=+142.273644491" Dec 08 20:07:06 crc kubenswrapper[4781]: W1208 20:07:06.150409 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98cdcf53_4bbb_4c2b_8fef_e5f3983d03ba.slice/crio-92764f573ade17cc930d7310dce7e611476fe5ca8f41a75381520761e3211fa3 WatchSource:0}: Error finding container 92764f573ade17cc930d7310dce7e611476fe5ca8f41a75381520761e3211fa3: Status 404 returned error can't find the container with id 92764f573ade17cc930d7310dce7e611476fe5ca8f41a75381520761e3211fa3 Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.162616 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7"] Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.177946 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.178181 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.678130474 +0000 UTC m=+142.829413851 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.178631 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.179043 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.679027269 +0000 UTC m=+142.830310646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.211221 4781 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-tqpvm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.211273 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" podUID="b05dbba1-cec3-4a66-aa31-7362bb50ed2f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.281128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.281326 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-08 20:07:06.781282888 +0000 UTC m=+142.932566265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.281614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.281991 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.781983897 +0000 UTC m=+142.933267274 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.382449 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.382802 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.882789025 +0000 UTC m=+143.034072402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.394211 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" event={"ID":"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc","Type":"ContainerStarted","Data":"8ee8b431962a5cfcaa5bdb80c892fff1a498d6232bd73ba55cc568e2cb73d292"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.394248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" event={"ID":"7f2241e4-2e96-4d40-9bbf-aaae0e6948cc","Type":"ContainerStarted","Data":"6fa9f914b4a3ad45ce5495031fb71620a2a8d02fb595fedec71b3fd7b388273c"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.418878 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" event={"ID":"fecbaff0-859c-4fc4-b2fd-56fbc53c192b","Type":"ContainerStarted","Data":"520e65b56ac1d628a75738ed149281b991ac47384a7f13ea9d0c9ddbc05c1196"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.418934 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" event={"ID":"fecbaff0-859c-4fc4-b2fd-56fbc53c192b","Type":"ContainerStarted","Data":"1eb995f4fe3fea3b9ae09c198163eea35e26eea6ab6e792343d3553152e61c4f"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.430963 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" event={"ID":"57b7029d-c823-4d00-9b7e-e606ba5d8bf6","Type":"ContainerStarted","Data":"a837d5fadae7e62c1cf819823191453c338ac0dd480b502c5b56b75e00fdb6dd"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.437817 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vq67l" event={"ID":"e761a9a6-bbf9-4bc8-9c36-358522654b25","Type":"ContainerStarted","Data":"57b4cf582b1b0f9966f26d946102eae55acfd86e6cd08b0cfd39680b17abcda5"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.441338 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" event={"ID":"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c","Type":"ContainerStarted","Data":"e1eefa5198dc913449e956c4a9889226043e44af8ae562ceeeffcd06f9e92ce1"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.453842 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" event={"ID":"513ea4da-405c-4176-adbd-c8e5f68c631c","Type":"ContainerStarted","Data":"ac85416869e8236c3c16635de6e9a85e337a0061a8137d1082b9a6ee4b08a538"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.459200 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" event={"ID":"a0818ea3-b629-47a3-8edb-d77e60a23068","Type":"ContainerStarted","Data":"d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.460884 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.465232 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" 
event={"ID":"959c7256-02a6-47bb-9b32-64387b359e95","Type":"ContainerStarted","Data":"c114df552fff4a23106b697c3536b016df088720229f1421c3eaf85ab65b2ea9"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.468653 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" event={"ID":"a9fba569-4fe5-44f1-905c-c0e844f64fca","Type":"ContainerStarted","Data":"cebcbf74705744612571f955c01c4a17e2d7dd1d39fc2198f57a1eabad6fc6a7"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.478890 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" event={"ID":"3ec45fa5-9d85-4aea-b458-79c149b76828","Type":"ContainerStarted","Data":"eb5b84f4699212b17edd93e80602486c38a230e3efe7c0b777b374305d120f55"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.479372 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.483498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.484192 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:06.98417815 +0000 UTC m=+143.135461527 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.494377 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" event={"ID":"23d965c6-71e4-4594-a70e-aad1e2b24c3f","Type":"ContainerStarted","Data":"418af1b754e2c60126eb77b32c170b70ae47a4d3df3c4addd88cfb47b729118e"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.498503 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jmklp" podStartSLOduration=122.498488323 podStartE2EDuration="2m2.498488323s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.453373783 +0000 UTC m=+142.604657150" watchObservedRunningTime="2025-12-08 20:07:06.498488323 +0000 UTC m=+142.649771700" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.499343 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lx5mt" podStartSLOduration=122.499337937 podStartE2EDuration="2m2.499337937s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.498827902 +0000 UTC m=+142.650111279" watchObservedRunningTime="2025-12-08 20:07:06.499337937 +0000 UTC m=+142.650621314" Dec 08 
20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.520249 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fpzkk" event={"ID":"b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2","Type":"ContainerStarted","Data":"75dc14a2f57a09ecfbc44d7c9bf03e3d67abdbd30b8a31280c00aa4f6ac2e475"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.532873 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5cnmw" event={"ID":"d92d2f16-50ee-482a-8d85-d0ac080b1a2a","Type":"ContainerStarted","Data":"4e9c9a6b29b7fd60e131cefbe101cb0df9f3017d5d5c75b6c1ebe3ae986de405"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.543370 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" podStartSLOduration=122.543349146 podStartE2EDuration="2m2.543349146s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.542813011 +0000 UTC m=+142.694096388" watchObservedRunningTime="2025-12-08 20:07:06.543349146 +0000 UTC m=+142.694632523" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.553270 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" event={"ID":"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d","Type":"ContainerStarted","Data":"40eb5c1a4ab0b443d7454965bb39a86163808ac2824927c017e49081e69a5407"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.557131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" event={"ID":"4640a664-9c50-4360-bdc1-5c177822719a","Type":"ContainerStarted","Data":"4937f5ec5d37c72edd76f10cb63ac1b0a50f550048081fbc6c40fbb8ae9c28a4"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.571748 
4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" podStartSLOduration=122.571733215 podStartE2EDuration="2m2.571733215s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.571667633 +0000 UTC m=+142.722951010" watchObservedRunningTime="2025-12-08 20:07:06.571733215 +0000 UTC m=+142.723016592" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.581038 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" event={"ID":"309ab965-f0fc-44cd-9e5a-dcac0f258329","Type":"ContainerStarted","Data":"e1dd1c80ea9fa82d258b78b89721ae502670a368605fa34e1ee344e99d58d110"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.584460 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.585432 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.08540236 +0000 UTC m=+143.236685777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.600973 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" podStartSLOduration=122.600953538 podStartE2EDuration="2m2.600953538s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.59926555 +0000 UTC m=+142.750548927" watchObservedRunningTime="2025-12-08 20:07:06.600953538 +0000 UTC m=+142.752236915" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.625754 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" event={"ID":"6f836b62-7a30-4b02-93e9-d462f3c28d47","Type":"ContainerStarted","Data":"292fec87f53277fe52edc79f9528692c3be79c70501e1b57f866ef1706e0e0d6"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.631451 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" event={"ID":"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb","Type":"ContainerStarted","Data":"1c6bbec2fff1992a283cb72533749131916ae1681d27663f0b083ff1cfdb4def"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.633679 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5cnmw" podStartSLOduration=6.633665359 
podStartE2EDuration="6.633665359s" podCreationTimestamp="2025-12-08 20:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.632903107 +0000 UTC m=+142.784186504" watchObservedRunningTime="2025-12-08 20:07:06.633665359 +0000 UTC m=+142.784948736" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.640905 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" event={"ID":"5169f4a3-b20b-474d-af87-a5f29902edbf","Type":"ContainerStarted","Data":"321ef062c00a5d7c4dd783701be61acaab8a712edc839ab3656041d058f299ed"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.657993 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nxdff" event={"ID":"76a2ef7e-aad6-429b-9ce1-824672b5e201","Type":"ContainerStarted","Data":"6143bd0e7c3ff7803e5581b209a0a43b04f4ad4c1274de73df42c80fce5a6e88"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.665222 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gphmx" podStartSLOduration=122.665202046 podStartE2EDuration="2m2.665202046s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.664644021 +0000 UTC m=+142.815927398" watchObservedRunningTime="2025-12-08 20:07:06.665202046 +0000 UTC m=+142.816485423" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.687096 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" event={"ID":"6529df15-5883-406f-a2e1-96fa351af72d","Type":"ContainerStarted","Data":"09bd4855158bccf344f86370ab0825ba03990f5a79a995eb38407edbc2cecc80"} Dec 08 20:07:06 crc 
kubenswrapper[4781]: I1208 20:07:06.689910 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.691134 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.191120956 +0000 UTC m=+143.342404413 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.696732 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" event={"ID":"8db57361-77ab-43d3-acc5-d4de29c8f13e","Type":"ContainerStarted","Data":"16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.697501 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.703085 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k7smf" 
event={"ID":"c59dd266-8743-4fa4-8641-450ea02d8dd4","Type":"ContainerStarted","Data":"acb7001d7d31b63e5f39f5136e3e03b25f8c6bb99742a306c65eafd6c6497baa"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.703132 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k7smf" event={"ID":"c59dd266-8743-4fa4-8641-450ea02d8dd4","Type":"ContainerStarted","Data":"cc6e6be9ed4af960c8d5837086580849b3316c74f977a82ac3d8db57c9106202"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.711049 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" event={"ID":"1e04db77-5adb-4052-b98c-e241b5f73973","Type":"ContainerStarted","Data":"09716f507c3ff3af74a0b899ebc03c573660b535de6607738d27245e0e33a6fe"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.711093 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" event={"ID":"1e04db77-5adb-4052-b98c-e241b5f73973","Type":"ContainerStarted","Data":"b94dbcc8add40ef1851a83f7c16bdec59882d97a086c49c2d5277e164474078b"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.717028 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-k9nwr" podStartSLOduration=122.717009745 podStartE2EDuration="2m2.717009745s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.71399872 +0000 UTC m=+142.865282097" watchObservedRunningTime="2025-12-08 20:07:06.717009745 +0000 UTC m=+142.868293122" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.786631 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" 
event={"ID":"3469de51-445a-4bda-9f65-d7e62b7ce452","Type":"ContainerStarted","Data":"4b33575d2912ed58e8f9b0643fc04c404d9f310c2102a7f47c3d627e2f78a4b2"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.790548 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.794305 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.294287551 +0000 UTC m=+143.445570928 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.804210 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-k7smf" podStartSLOduration=122.804185829 podStartE2EDuration="2m2.804185829s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.796844363 +0000 UTC m=+142.948127740" watchObservedRunningTime="2025-12-08 20:07:06.804185829 +0000 UTC m=+142.955469206" Dec 08 20:07:06 crc 
kubenswrapper[4781]: I1208 20:07:06.805840 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" podStartSLOduration=122.805833586 podStartE2EDuration="2m2.805833586s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.772279981 +0000 UTC m=+142.923563358" watchObservedRunningTime="2025-12-08 20:07:06.805833586 +0000 UTC m=+142.957116963" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.823535 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-gdrf7" event={"ID":"0f573c3d-716d-4f05-9308-6983d8c30570","Type":"ContainerStarted","Data":"649c594858fdcaa5a98541f348296c0a58798ba8bffd2f63a6e6822f9a8528e6"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.863802 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" event={"ID":"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba","Type":"ContainerStarted","Data":"92764f573ade17cc930d7310dce7e611476fe5ca8f41a75381520761e3211fa3"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.876264 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" event={"ID":"bffb4880-1131-4b65-ad90-ce2a1f549e6e","Type":"ContainerStarted","Data":"24688bfa930d79b516a28ce5965a2f40c49d6ca09bc5e4d8bb6e8f0701d83459"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.876312 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" event={"ID":"bffb4880-1131-4b65-ad90-ce2a1f549e6e","Type":"ContainerStarted","Data":"b311472a91a8850febf31aa1ced857e4704ac5e7d0113e0652f700c0fefcea6e"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.876585 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.881910 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6vd86" podStartSLOduration=122.881894647 podStartE2EDuration="2m2.881894647s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.833175646 +0000 UTC m=+142.984459023" watchObservedRunningTime="2025-12-08 20:07:06.881894647 +0000 UTC m=+143.033178024" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.893406 4781 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nh4xs container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.893469 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" podUID="bffb4880-1131-4b65-ad90-ce2a1f549e6e" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 08 20:07:06 crc kubenswrapper[4781]: E1208 20:07:06.896116 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.396099657 +0000 UTC m=+143.547383034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.905029 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.913876 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" event={"ID":"8b3cbdf0-946b-454e-b384-0b65fb672971","Type":"ContainerStarted","Data":"607409bf4b5c7791cc0538a5ebd92cf3a9532a7fbbbbd9339c12aae82ac09eca"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.917053 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.923948 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" podStartSLOduration=122.923937241 podStartE2EDuration="2m2.923937241s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.913830966 +0000 UTC m=+143.065114343" 
watchObservedRunningTime="2025-12-08 20:07:06.923937241 +0000 UTC m=+143.075220618" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.925592 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9xplh" podStartSLOduration=122.925586667 podStartE2EDuration="2m2.925586667s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.892870926 +0000 UTC m=+143.044154303" watchObservedRunningTime="2025-12-08 20:07:06.925586667 +0000 UTC m=+143.076870044" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.964537 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" event={"ID":"d315f8c3-da9d-46c0-ae31-8246ab341423","Type":"ContainerStarted","Data":"c8c2f63d1a720dccb087d86e326eec658a66d7c9caff7fe69f6cd4f617e8a2c3"} Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.965391 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-dfxlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.965435 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dfxlz" podUID="21854278-8a87-40ee-9209-509132febd54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Dec 08 20:07:06 crc kubenswrapper[4781]: I1208 20:07:06.977319 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:07:06 crc 
kubenswrapper[4781]: I1208 20:07:06.978973 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-586sh" podStartSLOduration=122.978938989 podStartE2EDuration="2m2.978938989s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:06.95587422 +0000 UTC m=+143.107157597" watchObservedRunningTime="2025-12-08 20:07:06.978938989 +0000 UTC m=+143.130222366" Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.005787 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.006766 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.506752812 +0000 UTC m=+143.658036189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.050128 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2prnc" podStartSLOduration=123.050106953 podStartE2EDuration="2m3.050106953s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:07.002138392 +0000 UTC m=+143.153421769" watchObservedRunningTime="2025-12-08 20:07:07.050106953 +0000 UTC m=+143.201390330" Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.052288 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.060432 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 20:07:07 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 08 20:07:07 crc kubenswrapper[4781]: [+]process-running ok Dec 08 20:07:07 crc kubenswrapper[4781]: healthz check failed Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.060490 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" 
podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.094270 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.122899 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.128273 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-gdrf7" Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.132160 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.632127412 +0000 UTC m=+143.783410789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.223943 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.224174 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.724144972 +0000 UTC m=+143.875428389 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.224372 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.224753 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.724738719 +0000 UTC m=+143.876022156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.327734 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.328631 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.828361146 +0000 UTC m=+143.979644523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.429580 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.430485 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:07.93047118 +0000 UTC m=+144.081754557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.530953 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.531447 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.031432153 +0000 UTC m=+144.182715530 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.633605 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.634076 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.134005101 +0000 UTC m=+144.285288478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.734463 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.735087 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.235065816 +0000 UTC m=+144.386349193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.836591 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.837029 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.337015106 +0000 UTC m=+144.488298483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.937269 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.937481 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.437455814 +0000 UTC m=+144.588739181 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.937855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:07 crc kubenswrapper[4781]: E1208 20:07:07.938252 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.438239086 +0000 UTC m=+144.589522473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:07 crc kubenswrapper[4781]: I1208 20:07:07.996462 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" event={"ID":"6ba1bc9f-dfc2-4d1f-b72d-c364be87fd6a","Type":"ContainerStarted","Data":"01355316adef4f5cab9391095bd364fba2c6b068431b9403908a44c7b198c4c1"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.004685 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fpzkk" event={"ID":"b637e90e-ee1f-4a50-b1ac-0f35d53b2bd2","Type":"ContainerStarted","Data":"6ab07b2c210df024122c829df82d7641ff2f3c815b89f35d131581ca2bbef822"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.012949 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-vq67l" event={"ID":"e761a9a6-bbf9-4bc8-9c36-358522654b25","Type":"ContainerStarted","Data":"824a3c60fe8079e5ff01ac4b0532b45f1a8c09c047a23fc7c559749da4a972b7"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.017616 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" event={"ID":"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d","Type":"ContainerStarted","Data":"38abc25bc5e508083b3e9547e941d69dfd4701836f114f76a8827c3710d6d07b"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.017693 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" 
event={"ID":"42eb9160-8b8c-462d-8d3f-1b8d0ec6810d","Type":"ContainerStarted","Data":"a70c6ee80562e7a68dc906547b3e493bb9eeecd8be25c135d510b8b2121c3701"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.024755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" event={"ID":"959c7256-02a6-47bb-9b32-64387b359e95","Type":"ContainerStarted","Data":"505d31ce1521e6161c544539096a8dc286d1e53b29759340e78e0a58d236ed05"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.025828 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.027329 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7lss8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.027370 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" podUID="959c7256-02a6-47bb-9b32-64387b359e95" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.030380 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" event={"ID":"a9fba569-4fe5-44f1-905c-c0e844f64fca","Type":"ContainerStarted","Data":"900147e17c6b104ab8adea4ee89787042952290eb1aac772e9d1c5df368adad5"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.036987 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" 
event={"ID":"5580e232-3b42-4950-b361-070ae3378aea","Type":"ContainerStarted","Data":"5619fb2ec7fb768afd4531f93f6523c2454c76bde677950240513c3b11fdd7e1"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.040242 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.040675 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.540658439 +0000 UTC m=+144.691941816 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.051997 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" event={"ID":"23d965c6-71e4-4594-a70e-aad1e2b24c3f","Type":"ContainerStarted","Data":"ca17385bea9f01b4b036c14fe4357bd0b051eacdb6cbf7ecf3be6e1db5d6016b"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.056641 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 08 20:07:08 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 08 20:07:08 crc kubenswrapper[4781]: [+]process-running ok Dec 08 20:07:08 crc kubenswrapper[4781]: healthz check failed Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.056692 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.058434 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt" podStartSLOduration=124.058420079 podStartE2EDuration="2m4.058420079s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.055980271 +0000 UTC m=+144.207263658" watchObservedRunningTime="2025-12-08 20:07:08.058420079 +0000 UTC m=+144.209703456" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.062011 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" event={"ID":"d3f6e0c4-9f69-4a5e-aee8-a7fd0aea0bdb","Type":"ContainerStarted","Data":"7392e49965a4080089145b6862b1e8e0a8bdd3ee3dcf431050b21caf0817ac22"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.077144 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" event={"ID":"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba","Type":"ContainerStarted","Data":"08ad05c2152ab2cb1ae99c0671319ab2d4f361861c02959205e40c33328d2aa8"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.088254 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h6zf7" podStartSLOduration=124.088239539 podStartE2EDuration="2m4.088239539s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.085527172 +0000 UTC m=+144.236810549" watchObservedRunningTime="2025-12-08 20:07:08.088239539 +0000 UTC m=+144.239522916" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.093224 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" event={"ID":"57b7029d-c823-4d00-9b7e-e606ba5d8bf6","Type":"ContainerStarted","Data":"5bf7868e52bcbe9374282a6b977ddfce132691a07908da0f37ce4241a9ec1754"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.098522 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c7h7f" event={"ID":"4640a664-9c50-4360-bdc1-5c177822719a","Type":"ContainerStarted","Data":"6177a8ac7e2e26d06e60fab658666cf39e7a766c2c411ae3c39ee08fa1ae1ac0"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.107495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nxdff" event={"ID":"76a2ef7e-aad6-429b-9ce1-824672b5e201","Type":"ContainerStarted","Data":"5fe89460986ffe41e0c1c1eb4fded010b7dbfeca31e1269194e40030190e1b9e"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.108223 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.118498 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" 
event={"ID":"309ab965-f0fc-44cd-9e5a-dcac0f258329","Type":"ContainerStarted","Data":"fbba8e0a9db1d0e2dde47f85dc392b757082fec5f22685a6e96fb669c3fb3098"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.118577 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" event={"ID":"309ab965-f0fc-44cd-9e5a-dcac0f258329","Type":"ContainerStarted","Data":"254bd3dd6d36d007e6aa44d577f6ceb2d50e9b2a91420a159266e33d9f8e3290"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.119282 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.120223 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dp272" podStartSLOduration=124.120209939 podStartE2EDuration="2m4.120209939s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.118165311 +0000 UTC m=+144.269448698" watchObservedRunningTime="2025-12-08 20:07:08.120209939 +0000 UTC m=+144.271493316" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.135841 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" event={"ID":"5169f4a3-b20b-474d-af87-a5f29902edbf","Type":"ContainerStarted","Data":"4ba1cfdf39f3e1888802e8801505768036cc5d11996b7be9fcfa7b6785c33a2c"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.136415 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.144559 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.145549 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fpzkk" podStartSLOduration=8.145527262 podStartE2EDuration="8.145527262s" podCreationTimestamp="2025-12-08 20:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.143902416 +0000 UTC m=+144.295185793" watchObservedRunningTime="2025-12-08 20:07:08.145527262 +0000 UTC m=+144.296810639" Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.148434 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.648422933 +0000 UTC m=+144.799706310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.158383 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" event={"ID":"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c","Type":"ContainerStarted","Data":"83626fe10e13001550124903d7598afbfa75c63490f795b811b07df6748748e6"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.158423 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" event={"ID":"cf3ca37f-4f9e-444a-9f18-ce4d1419a82c","Type":"ContainerStarted","Data":"bd15231281563f5d929c22df759f28f82b1c147d205c8dd13e698090f16d9ab1"} Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.184231 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.196581 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.196602 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-29wgh" podStartSLOduration=124.196573009 podStartE2EDuration="2m4.196573009s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.189538381 +0000 UTC m=+144.340821758" watchObservedRunningTime="2025-12-08 20:07:08.196573009 +0000 UTC m=+144.347856396" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.236047 4781 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b6gxs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.236123 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" podUID="7f2241e4-2e96-4d40-9bbf-aaae0e6948cc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.237244 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" podStartSLOduration=125.237228123 podStartE2EDuration="2m5.237228123s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.234544828 +0000 UTC m=+144.385828215" watchObservedRunningTime="2025-12-08 20:07:08.237228123 +0000 UTC m=+144.388511500" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.247700 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:08 crc 
kubenswrapper[4781]: E1208 20:07:08.249420 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.749400176 +0000 UTC m=+144.900683553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.250653 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nh4xs" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.291329 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-vq67l" podStartSLOduration=124.291311086 podStartE2EDuration="2m4.291311086s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.289458564 +0000 UTC m=+144.440741951" watchObservedRunningTime="2025-12-08 20:07:08.291311086 +0000 UTC m=+144.442594463" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.327361 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" podStartSLOduration=124.32733882 podStartE2EDuration="2m4.32733882s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.323810211 +0000 UTC m=+144.475093588" watchObservedRunningTime="2025-12-08 20:07:08.32733882 +0000 UTC m=+144.478622197" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.349754 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.355590 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.855573785 +0000 UTC m=+145.006857162 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.430372 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-bhv76" podStartSLOduration=124.43034793 podStartE2EDuration="2m4.43034793s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.376234977 +0000 UTC m=+144.527518354" watchObservedRunningTime="2025-12-08 20:07:08.43034793 +0000 UTC m=+144.581631307" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.454073 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.454453 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:08.954433138 +0000 UTC m=+145.105716525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.469483 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bgrbp" podStartSLOduration=124.469464202 podStartE2EDuration="2m4.469464202s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.431467182 +0000 UTC m=+144.582750569" watchObservedRunningTime="2025-12-08 20:07:08.469464202 +0000 UTC m=+144.620747579" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.503353 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-h4lz2" podStartSLOduration=124.503336525 podStartE2EDuration="2m4.503336525s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.472007043 +0000 UTC m=+144.623290440" watchObservedRunningTime="2025-12-08 20:07:08.503336525 +0000 UTC m=+144.654619902" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.506165 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" podStartSLOduration=124.506157805 podStartE2EDuration="2m4.506157805s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.501553515 +0000 UTC m=+144.652836892" watchObservedRunningTime="2025-12-08 20:07:08.506157805 +0000 UTC m=+144.657441182" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.556076 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.556669 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.056654146 +0000 UTC m=+145.207937523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.585132 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vjwv2" podStartSLOduration=124.585109757 podStartE2EDuration="2m4.585109757s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.544259887 +0000 UTC m=+144.695543264" watchObservedRunningTime="2025-12-08 20:07:08.585109757 +0000 UTC m=+144.736393134" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.585822 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" podStartSLOduration=124.585814387 podStartE2EDuration="2m4.585814387s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.582335479 +0000 UTC m=+144.733618856" watchObservedRunningTime="2025-12-08 20:07:08.585814387 +0000 UTC m=+144.737097764" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.658150 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.658472 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.158457102 +0000 UTC m=+145.309740479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.680938 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nxdff" podStartSLOduration=8.680905734 podStartE2EDuration="8.680905734s" podCreationTimestamp="2025-12-08 20:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:08.679799333 +0000 UTC m=+144.831082720" watchObservedRunningTime="2025-12-08 20:07:08.680905734 +0000 UTC m=+144.832189101" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.759660 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 
20:07:08.760183 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.260146445 +0000 UTC m=+145.411429822 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.860567 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.860716 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.360692696 +0000 UTC m=+145.511976083 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.860849 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.861206 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.36119581 +0000 UTC m=+145.512479197 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.958320 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d2rbv"] Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.959443 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.961613 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.961737 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.46172128 +0000 UTC m=+145.613004657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.961855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:08 crc kubenswrapper[4781]: E1208 20:07:08.962168 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.462157973 +0000 UTC m=+145.613441350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:08 crc kubenswrapper[4781]: I1208 20:07:08.989444 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.057785 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d2rbv"] Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.057848 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 20:07:09 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 08 20:07:09 crc kubenswrapper[4781]: [+]process-running ok Dec 08 20:07:09 crc kubenswrapper[4781]: healthz check failed Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.057890 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.063533 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.063716 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.563683071 +0000 UTC m=+145.714966448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.063796 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-utilities\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.063873 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcgb5\" (UniqueName: \"kubernetes.io/projected/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-kube-api-access-dcgb5\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.063947 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-catalog-content\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.063995 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.064277 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.564264617 +0000 UTC m=+145.715548054 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.166021 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.166259 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcgb5\" (UniqueName: \"kubernetes.io/projected/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-kube-api-access-dcgb5\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.166313 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-catalog-content\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.166424 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-utilities\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " 
pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.166504 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.666473995 +0000 UTC m=+145.817757382 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.166869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-utilities\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.167123 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-catalog-content\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.171948 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7lss8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection 
refused" start-of-body= Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.172019 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" podUID="959c7256-02a6-47bb-9b32-64387b359e95" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.185516 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-d2jpz" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.193760 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ghptn"] Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.194887 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.210283 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.230061 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcgb5\" (UniqueName: \"kubernetes.io/projected/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-kube-api-access-dcgb5\") pod \"community-operators-d2rbv\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.248905 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ghptn"] Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.268156 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.294608 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.300685 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.800639152 +0000 UTC m=+145.951922529 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.369590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.369954 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7bj9\" (UniqueName: \"kubernetes.io/projected/924b676c-8556-4d07-bf4b-a3607f3780d1-kube-api-access-c7bj9\") pod \"certified-operators-ghptn\" (UID: 
\"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.370001 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-catalog-content\") pod \"certified-operators-ghptn\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.370147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-utilities\") pod \"certified-operators-ghptn\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.370279 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.870260732 +0000 UTC m=+146.021544109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.387759 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t6j47"] Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.400243 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.420250 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6j47"] Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.473524 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7bj9\" (UniqueName: \"kubernetes.io/projected/924b676c-8556-4d07-bf4b-a3607f3780d1-kube-api-access-c7bj9\") pod \"certified-operators-ghptn\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.473888 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-catalog-content\") pod \"certified-operators-ghptn\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.473952 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-catalog-content\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.474034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.474370 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:09.974355663 +0000 UTC m=+146.125639050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.474492 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-catalog-content\") pod \"certified-operators-ghptn\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.474542 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-utilities\") pod \"certified-operators-ghptn\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.474960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-utilities\") pod \"certified-operators-ghptn\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.475016 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-utilities\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " 
pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.475096 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qffs\" (UniqueName: \"kubernetes.io/projected/41421b22-7fd3-4c47-a406-1681ecda5d10-kube-api-access-9qffs\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.504281 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7bj9\" (UniqueName: \"kubernetes.io/projected/924b676c-8556-4d07-bf4b-a3607f3780d1-kube-api-access-c7bj9\") pod \"certified-operators-ghptn\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.513250 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.562050 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5g7vv"] Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.570391 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.581403 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.581612 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qffs\" (UniqueName: \"kubernetes.io/projected/41421b22-7fd3-4c47-a406-1681ecda5d10-kube-api-access-9qffs\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.581679 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-catalog-content\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.581751 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-utilities\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.582279 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-utilities\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " 
pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.584282 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.084260137 +0000 UTC m=+146.235543514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.584387 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-catalog-content\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.588381 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5g7vv"] Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.628108 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qffs\" (UniqueName: \"kubernetes.io/projected/41421b22-7fd3-4c47-a406-1681ecda5d10-kube-api-access-9qffs\") pod \"community-operators-t6j47\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.682759 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxmmh\" (UniqueName: \"kubernetes.io/projected/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-kube-api-access-zxmmh\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.682845 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-utilities\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.682958 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-catalog-content\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.683001 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.683368 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.183352287 +0000 UTC m=+146.334635664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.741823 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.787540 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.788029 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxmmh\" (UniqueName: \"kubernetes.io/projected/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-kube-api-access-zxmmh\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.788043 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.288019603 +0000 UTC m=+146.439302990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.788395 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-utilities\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.788497 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-catalog-content\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.788539 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.788809 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-08 20:07:10.288799035 +0000 UTC m=+146.440082412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.789394 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-utilities\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.789486 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-catalog-content\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.828329 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxmmh\" (UniqueName: \"kubernetes.io/projected/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-kube-api-access-zxmmh\") pod \"certified-operators-5g7vv\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.907652 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:09 crc kubenswrapper[4781]: E1208 20:07:09.908069 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.408050133 +0000 UTC m=+146.559333510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.940568 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d2rbv"] Dec 08 20:07:09 crc kubenswrapper[4781]: I1208 20:07:09.992216 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.009745 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.010104 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.510089835 +0000 UTC m=+146.661373212 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.062503 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 20:07:10 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 08 20:07:10 crc kubenswrapper[4781]: [+]process-running ok Dec 08 20:07:10 crc kubenswrapper[4781]: healthz check failed Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.062545 4781 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.110598 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.111024 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.610991486 +0000 UTC m=+146.762274863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.172166 4781 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-b6gxs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.172506 4781 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" podUID="7f2241e4-2e96-4d40-9bbf-aaae0e6948cc" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.188849 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2rbv" event={"ID":"a48498fe-4ab1-428b-bf97-d8c7fe2d002a","Type":"ContainerStarted","Data":"d7212972bf7245c6968dff754d383861707879afff40551a0be0f53ac8c38886"} Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.189555 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ghptn"] Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.211854 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.212264 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.712248977 +0000 UTC m=+146.863532354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.249186 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" event={"ID":"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba","Type":"ContainerStarted","Data":"cdb3af8561c0d5ec526a54edca102a39fe059df98f61d9cf55e20a930c7bc714"} Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.250784 4781 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-7lss8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.250817 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" podUID="959c7256-02a6-47bb-9b32-64387b359e95" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.272409 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-b6gxs" Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.312502 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.313800 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.813781745 +0000 UTC m=+146.965065112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.415093 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.415454 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:10.915439537 +0000 UTC m=+147.066722914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.459737 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6j47"] Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.516217 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.517082 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.017061598 +0000 UTC m=+147.168344975 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.560209 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5g7vv"] Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.614633 4781 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.618685 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.619184 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.119171673 +0000 UTC m=+147.270455050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.721470 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.721774 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.221752701 +0000 UTC m=+147.373036078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.722021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.722345 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.222338008 +0000 UTC m=+147.373621385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.823362 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.823759 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.323727462 +0000 UTC m=+147.475010839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.823835 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.824351 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.324343519 +0000 UTC m=+147.475626886 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.925429 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.925591 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.425569038 +0000 UTC m=+147.576852415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:10 crc kubenswrapper[4781]: I1208 20:07:10.925847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:10 crc kubenswrapper[4781]: E1208 20:07:10.926164 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.426153635 +0000 UTC m=+147.577437022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.027117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:11 crc kubenswrapper[4781]: E1208 20:07:11.027345 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.527312773 +0000 UTC m=+147.678596160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.027793 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:11 crc kubenswrapper[4781]: E1208 20:07:11.028180 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.528164887 +0000 UTC m=+147.679448264 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.054288 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 20:07:11 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 08 20:07:11 crc kubenswrapper[4781]: [+]process-running ok Dec 08 20:07:11 crc kubenswrapper[4781]: healthz check failed Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.054362 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.123519 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vbq9n"] Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.124529 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.126183 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.128574 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.128719 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.128766 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.128793 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:07:11 crc 
kubenswrapper[4781]: I1208 20:07:11.128844 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:07:11 crc kubenswrapper[4781]: E1208 20:07:11.129803 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.629781707 +0000 UTC m=+147.781065094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.130145 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.134624 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.134670 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.135254 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.140498 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbq9n"] Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.230584 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.230668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxlvx\" (UniqueName: \"kubernetes.io/projected/ee128c5f-b1ca-4f73-b1db-d643edb27970-kube-api-access-qxlvx\") pod \"redhat-marketplace-vbq9n\" (UID: 
\"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.230751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-utilities\") pod \"redhat-marketplace-vbq9n\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.230819 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-catalog-content\") pod \"redhat-marketplace-vbq9n\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: E1208 20:07:11.231217 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.731203743 +0000 UTC m=+147.882487120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dh9vd" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.255364 4781 generic.go:334] "Generic (PLEG): container finished" podID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerID="61a4200ee30d459bcaa5024c2254b17dd6fe9238240ee4c60e4e0fef9540a053" exitCode=0 Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.255449 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghptn" event={"ID":"924b676c-8556-4d07-bf4b-a3607f3780d1","Type":"ContainerDied","Data":"61a4200ee30d459bcaa5024c2254b17dd6fe9238240ee4c60e4e0fef9540a053"} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.255479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghptn" event={"ID":"924b676c-8556-4d07-bf4b-a3607f3780d1","Type":"ContainerStarted","Data":"2b47b56dca279bb707d175260eb846c6b913fa75bab8ac5530a3ae962ba3df84"} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.257542 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.257661 4781 generic.go:334] "Generic (PLEG): container finished" podID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerID="f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968" exitCode=0 Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.257719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6j47" 
event={"ID":"41421b22-7fd3-4c47-a406-1681ecda5d10","Type":"ContainerDied","Data":"f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968"} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.257743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6j47" event={"ID":"41421b22-7fd3-4c47-a406-1681ecda5d10","Type":"ContainerStarted","Data":"9fdba995ec384d162e2b6f04530690e299e93d441dd9469410582ac0415c71c8"} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.262834 4781 generic.go:334] "Generic (PLEG): container finished" podID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerID="42303b2e7bbea21dc48d531eba20b0aa2f68c084eeac93bbd228ef93031637b9" exitCode=0 Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.262893 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2rbv" event={"ID":"a48498fe-4ab1-428b-bf97-d8c7fe2d002a","Type":"ContainerDied","Data":"42303b2e7bbea21dc48d531eba20b0aa2f68c084eeac93bbd228ef93031637b9"} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.266104 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" event={"ID":"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba","Type":"ContainerStarted","Data":"4e776be66497c4ebacd2ae02d924f5a4b56afb20ac09b455d6355c221461659c"} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.268814 4781 generic.go:334] "Generic (PLEG): container finished" podID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerID="1ed647ccfe43b9bfa9a3ee7dc2fdc449128e9dafa7e4cfa389dc608644a4bfe9" exitCode=0 Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.269431 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5g7vv" event={"ID":"aef6b08e-0164-4a0b-800b-c7a61bf2e35d","Type":"ContainerDied","Data":"1ed647ccfe43b9bfa9a3ee7dc2fdc449128e9dafa7e4cfa389dc608644a4bfe9"} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 
20:07:11.269541 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5g7vv" event={"ID":"aef6b08e-0164-4a0b-800b-c7a61bf2e35d","Type":"ContainerStarted","Data":"2ba37bf137c56b7d571d70e72ed412f5395bc7cd8f53e92e36bc71ee97c73e43"} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.329013 4781 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-08T20:07:10.614840901Z","Handler":null,"Name":""} Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.335344 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" podStartSLOduration=11.335323824 podStartE2EDuration="11.335323824s" podCreationTimestamp="2025-12-08 20:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:11.330041135 +0000 UTC m=+147.481324522" watchObservedRunningTime="2025-12-08 20:07:11.335323824 +0000 UTC m=+147.486607201" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.335547 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.335807 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-utilities\") pod \"redhat-marketplace-vbq9n\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.335861 
4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-catalog-content\") pod \"redhat-marketplace-vbq9n\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.336000 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxlvx\" (UniqueName: \"kubernetes.io/projected/ee128c5f-b1ca-4f73-b1db-d643edb27970-kube-api-access-qxlvx\") pod \"redhat-marketplace-vbq9n\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: E1208 20:07:11.336834 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-08 20:07:11.836815186 +0000 UTC m=+147.988098593 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.337450 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-catalog-content\") pod \"redhat-marketplace-vbq9n\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.337753 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-utilities\") pod \"redhat-marketplace-vbq9n\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.340430 4781 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.340453 4781 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.356272 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.365975 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.373703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxlvx\" (UniqueName: \"kubernetes.io/projected/ee128c5f-b1ca-4f73-b1db-d643edb27970-kube-api-access-qxlvx\") pod \"redhat-marketplace-vbq9n\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.374088 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.441012 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.453073 4781 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.453283 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.489790 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dh9vd\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.499197 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.538613 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vwcdr"] Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.541265 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.541779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.544640 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwcdr"] Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.647547 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-catalog-content\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.647644 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqsz\" (UniqueName: \"kubernetes.io/projected/560ff3a7-c3ed-4a54-80df-e7047ec1af42-kube-api-access-mtqsz\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.647668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-utilities\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.652616 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.693599 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:11 crc kubenswrapper[4781]: W1208 20:07:11.694132 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-58089b580e6110158583dbb88ff976fba5ffa2deeed00887a6caebeeb6887518 WatchSource:0}: Error finding container 58089b580e6110158583dbb88ff976fba5ffa2deeed00887a6caebeeb6887518: Status 404 returned error can't find the container with id 58089b580e6110158583dbb88ff976fba5ffa2deeed00887a6caebeeb6887518 Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.749436 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-utilities\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.749896 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqsz\" (UniqueName: \"kubernetes.io/projected/560ff3a7-c3ed-4a54-80df-e7047ec1af42-kube-api-access-mtqsz\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 
20:07:11.749965 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.750032 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-catalog-content\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.750192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-utilities\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.750451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-catalog-content\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.754022 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c74e396c-5b68-47be-b86b-9f48c02ec760-metrics-certs\") pod \"network-metrics-daemon-gr5xw\" (UID: \"c74e396c-5b68-47be-b86b-9f48c02ec760\") " pod="openshift-multus/network-metrics-daemon-gr5xw" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.779878 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mtqsz\" (UniqueName: \"kubernetes.io/projected/560ff3a7-c3ed-4a54-80df-e7047ec1af42-kube-api-access-mtqsz\") pod \"redhat-marketplace-vwcdr\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.783254 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.783870 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.787637 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.789242 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.793320 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.795784 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbq9n"] Dec 08 20:07:11 crc kubenswrapper[4781]: W1208 20:07:11.841604 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee128c5f_b1ca_4f73_b1db_d643edb27970.slice/crio-48488c2255eb50992d214196850e915e8071d387b8b632e814afd048374e13d1 WatchSource:0}: Error finding container 48488c2255eb50992d214196850e915e8071d387b8b632e814afd048374e13d1: Status 404 returned error can't find the container with id 48488c2255eb50992d214196850e915e8071d387b8b632e814afd048374e13d1 Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 
20:07:11.851399 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.851486 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.866310 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwcdr"
Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.952083 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.952170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.952273 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.953024 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dh9vd"]
Dec 08 20:07:11 crc kubenswrapper[4781]: W1208 20:07:11.963534 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1243b0_6d24_4282_a3e7_c1c87296ca09.slice/crio-3cadc4f31b505832a820aca930b9771faa7b6a69781ea9d86946865b7c239fc7 WatchSource:0}: Error finding container 3cadc4f31b505832a820aca930b9771faa7b6a69781ea9d86946865b7c239fc7: Status 404 returned error can't find the container with id 3cadc4f31b505832a820aca930b9771faa7b6a69781ea9d86946865b7c239fc7
Dec 08 20:07:11 crc kubenswrapper[4781]: I1208 20:07:11.974513 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.039116 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gr5xw"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.053905 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 08 20:07:12 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 08 20:07:12 crc kubenswrapper[4781]: [+]process-running ok
Dec 08 20:07:12 crc kubenswrapper[4781]: healthz check failed
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.054000 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.144910 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.145768 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lhcz4"]
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.146726 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhcz4"]
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.146842 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.149530 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwcdr"]
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.149794 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.168894 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.256191 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-catalog-content\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.256248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-utilities\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.256279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snz6\" (UniqueName: \"kubernetes.io/projected/2e09d1ea-0001-4821-9c36-20ec4618fcfc-kube-api-access-7snz6\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.314986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwcdr" event={"ID":"560ff3a7-c3ed-4a54-80df-e7047ec1af42","Type":"ContainerStarted","Data":"7500b80a466e1f10e1226b4fd1b9ce4f376093cce3f8938a04b4ec9eca59292e"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.317467 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" event={"ID":"bc1243b0-6d24-4282-a3e7-c1c87296ca09","Type":"ContainerStarted","Data":"df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.317492 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" event={"ID":"bc1243b0-6d24-4282-a3e7-c1c87296ca09","Type":"ContainerStarted","Data":"3cadc4f31b505832a820aca930b9771faa7b6a69781ea9d86946865b7c239fc7"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.319711 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b831d4e9475e935f73ca123e839c1e60097ff73f858cdf3bbc0391f659532a93"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.319794 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"49be117f9a5b8934f08254d92aef25f9e3a1a49311af5eeb98676dfc792bc782"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.335954 4781 generic.go:334] "Generic (PLEG): container finished" podID="23d965c6-71e4-4594-a70e-aad1e2b24c3f" containerID="ca17385bea9f01b4b036c14fe4357bd0b051eacdb6cbf7ecf3be6e1db5d6016b" exitCode=0
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.336052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" event={"ID":"23d965c6-71e4-4594-a70e-aad1e2b24c3f","Type":"ContainerDied","Data":"ca17385bea9f01b4b036c14fe4357bd0b051eacdb6cbf7ecf3be6e1db5d6016b"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.346530 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-dfxlz container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.346632 4781 patch_prober.go:28] interesting pod/downloads-7954f5f757-dfxlz container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.346585 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dfxlz" podUID="21854278-8a87-40ee-9209-509132febd54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.346678 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dfxlz" podUID="21854278-8a87-40ee-9209-509132febd54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.348247 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerID="c6c7a1962abecdf98f55f0b6fa87f1e13732977ccd5de7fc8f3f2f592b2b6a95" exitCode=0
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.348307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbq9n" event={"ID":"ee128c5f-b1ca-4f73-b1db-d643edb27970","Type":"ContainerDied","Data":"c6c7a1962abecdf98f55f0b6fa87f1e13732977ccd5de7fc8f3f2f592b2b6a95"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.348336 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbq9n" event={"ID":"ee128c5f-b1ca-4f73-b1db-d643edb27970","Type":"ContainerStarted","Data":"48488c2255eb50992d214196850e915e8071d387b8b632e814afd048374e13d1"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.358010 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-catalog-content\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.358062 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-utilities\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.358081 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snz6\" (UniqueName: \"kubernetes.io/projected/2e09d1ea-0001-4821-9c36-20ec4618fcfc-kube-api-access-7snz6\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.359025 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-utilities\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.359273 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-catalog-content\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.363241 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dx6bv" event={"ID":"98cdcf53-4bbb-4c2b-8fef-e5f3983d03ba","Type":"ContainerStarted","Data":"da5e5e3af0cc1807c66b493f60149e74f201f70f3ff1af3af0b55af831ea07ab"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.374889 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b50a615437900fa020d3cc860b6e30a31ef214277d68d156ac4c535548576954"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.374940 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c06838bfb802a1a8c2b5c4a79b7c2eeb14ebe6e954853676a78c9dad23b58691"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.375461 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.379186 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2b1701e96f1ebe6a169dc939d134134787e715806828110191c90e146d90aa0e"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.379217 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"58089b580e6110158583dbb88ff976fba5ffa2deeed00887a6caebeeb6887518"}
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.382889 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snz6\" (UniqueName: \"kubernetes.io/projected/2e09d1ea-0001-4821-9c36-20ec4618fcfc-kube-api-access-7snz6\") pod \"redhat-operators-lhcz4\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.416429 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-vq67l"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.416887 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-vq67l"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.423932 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-vq67l"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.468410 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhcz4"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.533947 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lb745"]
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.535880 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.552992 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lb745"]
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.556758 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gr5xw"]
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.653614 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.661758 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-catalog-content\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.661821 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8jgk\" (UniqueName: \"kubernetes.io/projected/ca98c5df-cacc-4016-a941-f02ebe37bc01-kube-api-access-w8jgk\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.661848 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-utilities\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.666540 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.666581 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.685525 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-tmmk6"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.686142 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-tmmk6"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.697541 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.716250 4781 patch_prober.go:28] interesting pod/console-f9d7485db-tmmk6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body=
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.716308 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-tmmk6" podUID="5669be4d-29d4-4cee-ad63-75f37e3727d2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.24:8443/health\": dial tcp 10.217.0.24:8443: connect: connection refused"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.763097 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-utilities\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.763465 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-catalog-content\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.763530 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8jgk\" (UniqueName: \"kubernetes.io/projected/ca98c5df-cacc-4016-a941-f02ebe37bc01-kube-api-access-w8jgk\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.764763 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-utilities\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.766075 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-catalog-content\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.793589 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8jgk\" (UniqueName: \"kubernetes.io/projected/ca98c5df-cacc-4016-a941-f02ebe37bc01-kube-api-access-w8jgk\") pod \"redhat-operators-lb745\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.796690 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhcz4"]
Dec 08 20:07:12 crc kubenswrapper[4781]: W1208 20:07:12.812700 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e09d1ea_0001_4821_9c36_20ec4618fcfc.slice/crio-92da0ce16ba55207085643d4a1aae953b1ebece6b30b7a23e61228de2191543f WatchSource:0}: Error finding container 92da0ce16ba55207085643d4a1aae953b1ebece6b30b7a23e61228de2191543f: Status 404 returned error can't find the container with id 92da0ce16ba55207085643d4a1aae953b1ebece6b30b7a23e61228de2191543f
Dec 08 20:07:12 crc kubenswrapper[4781]: I1208 20:07:12.896287 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lb745"
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.058311 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 08 20:07:13 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 08 20:07:13 crc kubenswrapper[4781]: [+]process-running ok
Dec 08 20:07:13 crc kubenswrapper[4781]: healthz check failed
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.058593 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.180569 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lb745"]
Dec 08 20:07:13 crc kubenswrapper[4781]: W1208 20:07:13.224476 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca98c5df_cacc_4016_a941_f02ebe37bc01.slice/crio-e827a980b2a1d3eab4f6da265ec3e6d22a9b22f3961885d9a5429017f0d5755d WatchSource:0}: Error finding container e827a980b2a1d3eab4f6da265ec3e6d22a9b22f3961885d9a5429017f0d5755d: Status 404 returned error can't find the container with id e827a980b2a1d3eab4f6da265ec3e6d22a9b22f3961885d9a5429017f0d5755d
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.397269 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b6e76a2f-0cd7-4c73-a806-ef653fc7916d","Type":"ContainerStarted","Data":"8b4dbd071b0402a8a721a3fe2be970bca28ebc63decfafb834f0002e464f5c56"}
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.397612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b6e76a2f-0cd7-4c73-a806-ef653fc7916d","Type":"ContainerStarted","Data":"9edcf7aaaabfe0f0d4c659817379a291a547ffb5a38b6e4679b5f2ddf81c5aea"}
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.400342 4781 generic.go:334] "Generic (PLEG): container finished" podID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerID="43250e1ee58029e075f3ad79489740149d5d7ed7de343e5f755b8ae783b352ad" exitCode=0
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.400522 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhcz4" event={"ID":"2e09d1ea-0001-4821-9c36-20ec4618fcfc","Type":"ContainerDied","Data":"43250e1ee58029e075f3ad79489740149d5d7ed7de343e5f755b8ae783b352ad"}
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.400574 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhcz4" event={"ID":"2e09d1ea-0001-4821-9c36-20ec4618fcfc","Type":"ContainerStarted","Data":"92da0ce16ba55207085643d4a1aae953b1ebece6b30b7a23e61228de2191543f"}
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.410138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" event={"ID":"c74e396c-5b68-47be-b86b-9f48c02ec760","Type":"ContainerStarted","Data":"53037470356fadea330c5588eef4f577cdc3fa77d013ee86a6bcaafaba0c73bf"}
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.410195 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" event={"ID":"c74e396c-5b68-47be-b86b-9f48c02ec760","Type":"ContainerStarted","Data":"c74983a490330b6a72447b1657b77ff275066964848cd906eb9d45e67155efe1"}
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.417963 4781 generic.go:334] "Generic (PLEG): container finished" podID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerID="188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3" exitCode=0
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.418020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwcdr" event={"ID":"560ff3a7-c3ed-4a54-80df-e7047ec1af42","Type":"ContainerDied","Data":"188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3"}
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.425441 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb745" event={"ID":"ca98c5df-cacc-4016-a941-f02ebe37bc01","Type":"ContainerStarted","Data":"e827a980b2a1d3eab4f6da265ec3e6d22a9b22f3961885d9a5429017f0d5755d"}
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.434529 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.434510073 podStartE2EDuration="2.434510073s" podCreationTimestamp="2025-12-08 20:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:13.414900631 +0000 UTC m=+149.566184008" watchObservedRunningTime="2025-12-08 20:07:13.434510073 +0000 UTC m=+149.585793450"
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.437880 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-vq67l"
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.438752 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r84kt"
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.461065 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" podStartSLOduration=129.46105072 podStartE2EDuration="2m9.46105072s" podCreationTimestamp="2025-12-08 20:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:13.459749274 +0000 UTC m=+149.611032651" watchObservedRunningTime="2025-12-08 20:07:13.46105072 +0000 UTC m=+149.612334097"
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.940438 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k"
Dec 08 20:07:13 crc kubenswrapper[4781]: I1208 20:07:13.970522 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.052447 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-k7smf"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.061200 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 08 20:07:14 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 08 20:07:14 crc kubenswrapper[4781]: [+]process-running ok
Dec 08 20:07:14 crc kubenswrapper[4781]: healthz check failed
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.061261 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.087928 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23d965c6-71e4-4594-a70e-aad1e2b24c3f-config-volume\") pod \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") "
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.088224 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v5qp\" (UniqueName: \"kubernetes.io/projected/23d965c6-71e4-4594-a70e-aad1e2b24c3f-kube-api-access-9v5qp\") pod \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") "
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.088288 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23d965c6-71e4-4594-a70e-aad1e2b24c3f-secret-volume\") pod \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\" (UID: \"23d965c6-71e4-4594-a70e-aad1e2b24c3f\") "
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.089168 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23d965c6-71e4-4594-a70e-aad1e2b24c3f-config-volume" (OuterVolumeSpecName: "config-volume") pod "23d965c6-71e4-4594-a70e-aad1e2b24c3f" (UID: "23d965c6-71e4-4594-a70e-aad1e2b24c3f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.096743 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d965c6-71e4-4594-a70e-aad1e2b24c3f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "23d965c6-71e4-4594-a70e-aad1e2b24c3f" (UID: "23d965c6-71e4-4594-a70e-aad1e2b24c3f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.129059 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d965c6-71e4-4594-a70e-aad1e2b24c3f-kube-api-access-9v5qp" (OuterVolumeSpecName: "kube-api-access-9v5qp") pod "23d965c6-71e4-4594-a70e-aad1e2b24c3f" (UID: "23d965c6-71e4-4594-a70e-aad1e2b24c3f"). InnerVolumeSpecName "kube-api-access-9v5qp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.154400 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.189441 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v5qp\" (UniqueName: \"kubernetes.io/projected/23d965c6-71e4-4594-a70e-aad1e2b24c3f-kube-api-access-9v5qp\") on node \"crc\" DevicePath \"\""
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.189741 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/23d965c6-71e4-4594-a70e-aad1e2b24c3f-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.189751 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/23d965c6-71e4-4594-a70e-aad1e2b24c3f-config-volume\") on node \"crc\" DevicePath \"\""
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.462126 4781 generic.go:334] "Generic (PLEG): container finished" podID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerID="9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6" exitCode=0
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.462210 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb745" event={"ID":"ca98c5df-cacc-4016-a941-f02ebe37bc01","Type":"ContainerDied","Data":"9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6"}
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.474203 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm" event={"ID":"23d965c6-71e4-4594-a70e-aad1e2b24c3f","Type":"ContainerDied","Data":"418af1b754e2c60126eb77b32c170b70ae47a4d3df3c4addd88cfb47b729118e"}
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.474241 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="418af1b754e2c60126eb77b32c170b70ae47a4d3df3c4addd88cfb47b729118e"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.474308 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.497194 4781 generic.go:334] "Generic (PLEG): container finished" podID="b6e76a2f-0cd7-4c73-a806-ef653fc7916d" containerID="8b4dbd071b0402a8a721a3fe2be970bca28ebc63decfafb834f0002e464f5c56" exitCode=0
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.497299 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b6e76a2f-0cd7-4c73-a806-ef653fc7916d","Type":"ContainerDied","Data":"8b4dbd071b0402a8a721a3fe2be970bca28ebc63decfafb834f0002e464f5c56"}
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.502211 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gr5xw" event={"ID":"c74e396c-5b68-47be-b86b-9f48c02ec760","Type":"ContainerStarted","Data":"6e3ee9d2f5f12a9fdf0371c6859b32a4f6ee62050e187da8b6799b71520176b4"}
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.547444 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gr5xw" podStartSLOduration=131.547421774 podStartE2EDuration="2m11.547421774s" podCreationTimestamp="2025-12-08 20:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:14.543691219 +0000 UTC m=+150.694974596" watchObservedRunningTime="2025-12-08 20:07:14.547421774 +0000 UTC m=+150.698705151"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.915790 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 08 20:07:14 crc kubenswrapper[4781]: E1208 20:07:14.918299 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d965c6-71e4-4594-a70e-aad1e2b24c3f" containerName="collect-profiles"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.918342 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d965c6-71e4-4594-a70e-aad1e2b24c3f" containerName="collect-profiles"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.918527 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d965c6-71e4-4594-a70e-aad1e2b24c3f" containerName="collect-profiles"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.928237 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.928459 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.934888 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 08 20:07:14 crc kubenswrapper[4781]: I1208 20:07:14.935098 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.054186 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 08 20:07:15 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld
Dec 08 20:07:15 crc kubenswrapper[4781]: [+]process-running ok
Dec 08 20:07:15 crc kubenswrapper[4781]: healthz check failed
Dec 08 20:07:15 crc kubenswrapper[4781]: I1208
20:07:15.054255 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.105997 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a73795ed-e585-49bd-b50c-4a15427e7997-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a73795ed-e585-49bd-b50c-4a15427e7997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.106053 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a73795ed-e585-49bd-b50c-4a15427e7997-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a73795ed-e585-49bd-b50c-4a15427e7997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.213099 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a73795ed-e585-49bd-b50c-4a15427e7997-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a73795ed-e585-49bd-b50c-4a15427e7997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.214025 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a73795ed-e585-49bd-b50c-4a15427e7997-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a73795ed-e585-49bd-b50c-4a15427e7997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.214060 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a73795ed-e585-49bd-b50c-4a15427e7997-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a73795ed-e585-49bd-b50c-4a15427e7997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.235961 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a73795ed-e585-49bd-b50c-4a15427e7997-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a73795ed-e585-49bd-b50c-4a15427e7997\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.253972 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:15 crc kubenswrapper[4781]: I1208 20:07:15.512140 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.060271 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 20:07:16 crc kubenswrapper[4781]: [-]has-synced failed: reason withheld Dec 08 20:07:16 crc kubenswrapper[4781]: [+]process-running ok Dec 08 20:07:16 crc kubenswrapper[4781]: healthz check failed Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.060523 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.086487 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.244066 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kubelet-dir\") pod \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\" (UID: \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\") " Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.244431 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kube-api-access\") pod \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\" (UID: \"b6e76a2f-0cd7-4c73-a806-ef653fc7916d\") " Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.245705 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b6e76a2f-0cd7-4c73-a806-ef653fc7916d" (UID: "b6e76a2f-0cd7-4c73-a806-ef653fc7916d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.255793 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b6e76a2f-0cd7-4c73-a806-ef653fc7916d" (UID: "b6e76a2f-0cd7-4c73-a806-ef653fc7916d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.346623 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.346676 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6e76a2f-0cd7-4c73-a806-ef653fc7916d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.580151 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b6e76a2f-0cd7-4c73-a806-ef653fc7916d","Type":"ContainerDied","Data":"9edcf7aaaabfe0f0d4c659817379a291a547ffb5a38b6e4679b5f2ddf81c5aea"} Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.580204 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9edcf7aaaabfe0f0d4c659817379a291a547ffb5a38b6e4679b5f2ddf81c5aea" Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.580224 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.584617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a73795ed-e585-49bd-b50c-4a15427e7997","Type":"ContainerStarted","Data":"70f2ede0a8eccc76266b7154fd2d40593c3e1c15925102c36738aef632d20b84"} Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.584653 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a73795ed-e585-49bd-b50c-4a15427e7997","Type":"ContainerStarted","Data":"0a5f33f447668da488646566286e4ba5eca58ce820218c29be492b7c20915eca"} Dec 08 20:07:16 crc kubenswrapper[4781]: I1208 20:07:16.599542 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.599524658 podStartE2EDuration="2.599524658s" podCreationTimestamp="2025-12-08 20:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:16.599059105 +0000 UTC m=+152.750342502" watchObservedRunningTime="2025-12-08 20:07:16.599524658 +0000 UTC m=+152.750808035" Dec 08 20:07:17 crc kubenswrapper[4781]: I1208 20:07:17.057212 4781 patch_prober.go:28] interesting pod/router-default-5444994796-k7smf container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 08 20:07:17 crc kubenswrapper[4781]: [+]has-synced ok Dec 08 20:07:17 crc kubenswrapper[4781]: [+]process-running ok Dec 08 20:07:17 crc kubenswrapper[4781]: healthz check failed Dec 08 20:07:17 crc kubenswrapper[4781]: I1208 20:07:17.057283 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k7smf" 
podUID="c59dd266-8743-4fa4-8641-450ea02d8dd4" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 08 20:07:17 crc kubenswrapper[4781]: I1208 20:07:17.600203 4781 generic.go:334] "Generic (PLEG): container finished" podID="a73795ed-e585-49bd-b50c-4a15427e7997" containerID="70f2ede0a8eccc76266b7154fd2d40593c3e1c15925102c36738aef632d20b84" exitCode=0 Dec 08 20:07:17 crc kubenswrapper[4781]: I1208 20:07:17.600244 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a73795ed-e585-49bd-b50c-4a15427e7997","Type":"ContainerDied","Data":"70f2ede0a8eccc76266b7154fd2d40593c3e1c15925102c36738aef632d20b84"} Dec 08 20:07:18 crc kubenswrapper[4781]: I1208 20:07:18.054016 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:18 crc kubenswrapper[4781]: I1208 20:07:18.056577 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-k7smf" Dec 08 20:07:18 crc kubenswrapper[4781]: I1208 20:07:18.916649 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nxdff" Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.023465 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.203602 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a73795ed-e585-49bd-b50c-4a15427e7997-kubelet-dir\") pod \"a73795ed-e585-49bd-b50c-4a15427e7997\" (UID: \"a73795ed-e585-49bd-b50c-4a15427e7997\") " Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.204587 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a73795ed-e585-49bd-b50c-4a15427e7997-kube-api-access\") pod \"a73795ed-e585-49bd-b50c-4a15427e7997\" (UID: \"a73795ed-e585-49bd-b50c-4a15427e7997\") " Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.205121 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a73795ed-e585-49bd-b50c-4a15427e7997-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a73795ed-e585-49bd-b50c-4a15427e7997" (UID: "a73795ed-e585-49bd-b50c-4a15427e7997"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.231999 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73795ed-e585-49bd-b50c-4a15427e7997-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a73795ed-e585-49bd-b50c-4a15427e7997" (UID: "a73795ed-e585-49bd-b50c-4a15427e7997"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.305811 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a73795ed-e585-49bd-b50c-4a15427e7997-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.306198 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a73795ed-e585-49bd-b50c-4a15427e7997-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.648206 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a73795ed-e585-49bd-b50c-4a15427e7997","Type":"ContainerDied","Data":"0a5f33f447668da488646566286e4ba5eca58ce820218c29be492b7c20915eca"} Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.648252 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a5f33f447668da488646566286e4ba5eca58ce820218c29be492b7c20915eca" Dec 08 20:07:19 crc kubenswrapper[4781]: I1208 20:07:19.648252 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 08 20:07:21 crc kubenswrapper[4781]: I1208 20:07:21.694410 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:22 crc kubenswrapper[4781]: I1208 20:07:22.364564 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dfxlz" Dec 08 20:07:22 crc kubenswrapper[4781]: I1208 20:07:22.688731 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:22 crc kubenswrapper[4781]: I1208 20:07:22.693369 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:07:29 crc kubenswrapper[4781]: I1208 20:07:29.947759 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:07:29 crc kubenswrapper[4781]: I1208 20:07:29.948456 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:07:31 crc kubenswrapper[4781]: I1208 20:07:31.699557 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:07:40 crc kubenswrapper[4781]: E1208 20:07:40.908002 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 08 20:07:40 crc kubenswrapper[4781]: E1208 20:07:40.908792 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qxlvx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vbq9n_openshift-marketplace(ee128c5f-b1ca-4f73-b1db-d643edb27970): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 
08 20:07:40 crc kubenswrapper[4781]: E1208 20:07:40.910230 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vbq9n" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" Dec 08 20:07:41 crc kubenswrapper[4781]: E1208 20:07:41.748122 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 08 20:07:41 crc kubenswrapper[4781]: E1208 20:07:41.748301 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w8jgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lb745_openshift-marketplace(ca98c5df-cacc-4016-a941-f02ebe37bc01): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 20:07:41 crc kubenswrapper[4781]: E1208 20:07:41.749730 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lb745" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" Dec 08 20:07:43 crc 
kubenswrapper[4781]: E1208 20:07:43.249593 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vbq9n" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" Dec 08 20:07:43 crc kubenswrapper[4781]: E1208 20:07:43.249720 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lb745" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" Dec 08 20:07:43 crc kubenswrapper[4781]: E1208 20:07:43.316991 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 08 20:07:43 crc kubenswrapper[4781]: E1208 20:07:43.317150 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxmmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5g7vv_openshift-marketplace(aef6b08e-0164-4a0b-800b-c7a61bf2e35d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 20:07:43 crc kubenswrapper[4781]: E1208 20:07:43.318338 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5g7vv" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" Dec 08 20:07:44 crc 
kubenswrapper[4781]: I1208 20:07:44.116732 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xv7vb" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.606714 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5g7vv" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.684623 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.684769 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dcgb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-d2rbv_openshift-marketplace(a48498fe-4ab1-428b-bf97-d8c7fe2d002a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.685950 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-d2rbv" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" Dec 08 20:07:44 crc 
kubenswrapper[4781]: E1208 20:07:44.689994 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.690227 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7snz6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-lhcz4_openshift-marketplace(2e09d1ea-0001-4821-9c36-20ec4618fcfc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.691481 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lhcz4" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.710531 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.710608 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.710722 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9qffs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-t6j47_openshift-marketplace(41421b22-7fd3-4c47-a406-1681ecda5d10): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.711185 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c7bj9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ghptn_openshift-marketplace(924b676c-8556-4d07-bf4b-a3607f3780d1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.712812 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ghptn" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" Dec 08 20:07:44 crc 
kubenswrapper[4781]: E1208 20:07:44.712871 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-t6j47" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.794067 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-d2rbv" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.794340 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ghptn" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.795332 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-t6j47" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" Dec 08 20:07:44 crc kubenswrapper[4781]: E1208 20:07:44.795458 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lhcz4" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" Dec 08 20:07:45 crc kubenswrapper[4781]: 
I1208 20:07:45.799629 4781 generic.go:334] "Generic (PLEG): container finished" podID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerID="beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077" exitCode=0 Dec 08 20:07:45 crc kubenswrapper[4781]: I1208 20:07:45.799692 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwcdr" event={"ID":"560ff3a7-c3ed-4a54-80df-e7047ec1af42","Type":"ContainerDied","Data":"beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077"} Dec 08 20:07:46 crc kubenswrapper[4781]: I1208 20:07:46.806475 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwcdr" event={"ID":"560ff3a7-c3ed-4a54-80df-e7047ec1af42","Type":"ContainerStarted","Data":"afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a"} Dec 08 20:07:46 crc kubenswrapper[4781]: I1208 20:07:46.823973 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vwcdr" podStartSLOduration=3.034408821 podStartE2EDuration="35.823952219s" podCreationTimestamp="2025-12-08 20:07:11 +0000 UTC" firstStartedPulling="2025-12-08 20:07:13.42553221 +0000 UTC m=+149.576815587" lastFinishedPulling="2025-12-08 20:07:46.215075598 +0000 UTC m=+182.366358985" observedRunningTime="2025-12-08 20:07:46.820582534 +0000 UTC m=+182.971865921" watchObservedRunningTime="2025-12-08 20:07:46.823952219 +0000 UTC m=+182.975235596" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.886342 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 08 20:07:47 crc kubenswrapper[4781]: E1208 20:07:47.887129 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73795ed-e585-49bd-b50c-4a15427e7997" containerName="pruner" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.887146 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a73795ed-e585-49bd-b50c-4a15427e7997" containerName="pruner" Dec 08 20:07:47 crc kubenswrapper[4781]: E1208 20:07:47.887163 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e76a2f-0cd7-4c73-a806-ef653fc7916d" containerName="pruner" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.887170 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e76a2f-0cd7-4c73-a806-ef653fc7916d" containerName="pruner" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.887280 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e76a2f-0cd7-4c73-a806-ef653fc7916d" containerName="pruner" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.887298 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73795ed-e585-49bd-b50c-4a15427e7997" containerName="pruner" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.887771 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.890062 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.892270 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.892523 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 08 20:07:47 crc kubenswrapper[4781]: I1208 20:07:47.904191 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0a28822-e149-4041-a8ed-d7afdb738d74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d0a28822-e149-4041-a8ed-d7afdb738d74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:47 crc 
kubenswrapper[4781]: I1208 20:07:47.904236 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a28822-e149-4041-a8ed-d7afdb738d74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0a28822-e149-4041-a8ed-d7afdb738d74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.005249 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0a28822-e149-4041-a8ed-d7afdb738d74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d0a28822-e149-4041-a8ed-d7afdb738d74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.005295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a28822-e149-4041-a8ed-d7afdb738d74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0a28822-e149-4041-a8ed-d7afdb738d74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.005411 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0a28822-e149-4041-a8ed-d7afdb738d74-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d0a28822-e149-4041-a8ed-d7afdb738d74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.027651 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a28822-e149-4041-a8ed-d7afdb738d74-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0a28822-e149-4041-a8ed-d7afdb738d74\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.218413 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.417211 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 08 20:07:48 crc kubenswrapper[4781]: W1208 20:07:48.428093 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd0a28822_e149_4041_a8ed_d7afdb738d74.slice/crio-cda64b738ff7ec7ca594cd589fa758bf18ec370eba24d4a9926bef63a154015b WatchSource:0}: Error finding container cda64b738ff7ec7ca594cd589fa758bf18ec370eba24d4a9926bef63a154015b: Status 404 returned error can't find the container with id cda64b738ff7ec7ca594cd589fa758bf18ec370eba24d4a9926bef63a154015b Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.824349 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d0a28822-e149-4041-a8ed-d7afdb738d74","Type":"ContainerStarted","Data":"d4680a07382f91802254d85eb519d2aa8504616205861056c5efe1a04456e816"} Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.824879 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d0a28822-e149-4041-a8ed-d7afdb738d74","Type":"ContainerStarted","Data":"cda64b738ff7ec7ca594cd589fa758bf18ec370eba24d4a9926bef63a154015b"} Dec 08 20:07:48 crc kubenswrapper[4781]: I1208 20:07:48.837057 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.837041405 podStartE2EDuration="1.837041405s" podCreationTimestamp="2025-12-08 20:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:48.836487459 +0000 UTC m=+184.987770836" watchObservedRunningTime="2025-12-08 20:07:48.837041405 +0000 
UTC m=+184.988324782" Dec 08 20:07:49 crc kubenswrapper[4781]: I1208 20:07:49.830976 4781 generic.go:334] "Generic (PLEG): container finished" podID="d0a28822-e149-4041-a8ed-d7afdb738d74" containerID="d4680a07382f91802254d85eb519d2aa8504616205861056c5efe1a04456e816" exitCode=0 Dec 08 20:07:49 crc kubenswrapper[4781]: I1208 20:07:49.831054 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d0a28822-e149-4041-a8ed-d7afdb738d74","Type":"ContainerDied","Data":"d4680a07382f91802254d85eb519d2aa8504616205861056c5efe1a04456e816"} Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.026529 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.147907 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0a28822-e149-4041-a8ed-d7afdb738d74-kubelet-dir\") pod \"d0a28822-e149-4041-a8ed-d7afdb738d74\" (UID: \"d0a28822-e149-4041-a8ed-d7afdb738d74\") " Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.148004 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0a28822-e149-4041-a8ed-d7afdb738d74-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d0a28822-e149-4041-a8ed-d7afdb738d74" (UID: "d0a28822-e149-4041-a8ed-d7afdb738d74"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.148098 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a28822-e149-4041-a8ed-d7afdb738d74-kube-api-access\") pod \"d0a28822-e149-4041-a8ed-d7afdb738d74\" (UID: \"d0a28822-e149-4041-a8ed-d7afdb738d74\") " Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.148385 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0a28822-e149-4041-a8ed-d7afdb738d74-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.154119 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a28822-e149-4041-a8ed-d7afdb738d74-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d0a28822-e149-4041-a8ed-d7afdb738d74" (UID: "d0a28822-e149-4041-a8ed-d7afdb738d74"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.249772 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0a28822-e149-4041-a8ed-d7afdb738d74-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.379118 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.844090 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"d0a28822-e149-4041-a8ed-d7afdb738d74","Type":"ContainerDied","Data":"cda64b738ff7ec7ca594cd589fa758bf18ec370eba24d4a9926bef63a154015b"} Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.844126 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cda64b738ff7ec7ca594cd589fa758bf18ec370eba24d4a9926bef63a154015b" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.844182 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.867339 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.867385 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.948186 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:51 crc kubenswrapper[4781]: I1208 20:07:51.992784 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-546nt"] Dec 08 20:07:52 crc kubenswrapper[4781]: I1208 20:07:52.891288 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:52 crc kubenswrapper[4781]: I1208 20:07:52.930214 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwcdr"] Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.077465 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 08 20:07:54 crc kubenswrapper[4781]: E1208 20:07:54.077714 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a28822-e149-4041-a8ed-d7afdb738d74" containerName="pruner" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.077727 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a28822-e149-4041-a8ed-d7afdb738d74" containerName="pruner" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.077847 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a28822-e149-4041-a8ed-d7afdb738d74" containerName="pruner" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.078293 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.080569 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.082597 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.086521 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.098218 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.098715 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-var-lock\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.098828 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00264077-cf52-457f-80f7-0d149562535a-kube-api-access\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.200203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/00264077-cf52-457f-80f7-0d149562535a-kube-api-access\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.200639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.200717 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.200814 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-var-lock\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.200892 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-var-lock\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.234466 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00264077-cf52-457f-80f7-0d149562535a-kube-api-access\") pod \"installer-9-crc\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") 
" pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.403490 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.579999 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.860366 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vwcdr" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerName="registry-server" containerID="cri-o://afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a" gracePeriod=2 Dec 08 20:07:54 crc kubenswrapper[4781]: I1208 20:07:54.860850 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"00264077-cf52-457f-80f7-0d149562535a","Type":"ContainerStarted","Data":"9269ff92410a08fc13af558bd0bff1adaf355e1e75002ef7854320c3dc936646"} Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.188121 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.211204 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-catalog-content\") pod \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.211271 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-utilities\") pod \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.211306 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtqsz\" (UniqueName: \"kubernetes.io/projected/560ff3a7-c3ed-4a54-80df-e7047ec1af42-kube-api-access-mtqsz\") pod \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\" (UID: \"560ff3a7-c3ed-4a54-80df-e7047ec1af42\") " Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.212238 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-utilities" (OuterVolumeSpecName: "utilities") pod "560ff3a7-c3ed-4a54-80df-e7047ec1af42" (UID: "560ff3a7-c3ed-4a54-80df-e7047ec1af42"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.226108 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.240671 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/560ff3a7-c3ed-4a54-80df-e7047ec1af42-kube-api-access-mtqsz" (OuterVolumeSpecName: "kube-api-access-mtqsz") pod "560ff3a7-c3ed-4a54-80df-e7047ec1af42" (UID: "560ff3a7-c3ed-4a54-80df-e7047ec1af42"). InnerVolumeSpecName "kube-api-access-mtqsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.258450 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "560ff3a7-c3ed-4a54-80df-e7047ec1af42" (UID: "560ff3a7-c3ed-4a54-80df-e7047ec1af42"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.326694 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/560ff3a7-c3ed-4a54-80df-e7047ec1af42-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.326728 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtqsz\" (UniqueName: \"kubernetes.io/projected/560ff3a7-c3ed-4a54-80df-e7047ec1af42-kube-api-access-mtqsz\") on node \"crc\" DevicePath \"\"" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.874228 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"00264077-cf52-457f-80f7-0d149562535a","Type":"ContainerStarted","Data":"d1acdf6a7f9fe90ec06f15f6c8f7a2d2ea52271ace199b6ed17a5c6338eb4e77"} Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.877151 4781 generic.go:334] "Generic (PLEG): container finished" podID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerID="afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a" exitCode=0 Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.877186 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwcdr" event={"ID":"560ff3a7-c3ed-4a54-80df-e7047ec1af42","Type":"ContainerDied","Data":"afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a"} Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.877207 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwcdr" event={"ID":"560ff3a7-c3ed-4a54-80df-e7047ec1af42","Type":"ContainerDied","Data":"7500b80a466e1f10e1226b4fd1b9ce4f376093cce3f8938a04b4ec9eca59292e"} Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.877235 4781 scope.go:117] "RemoveContainer" containerID="afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a" 
Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.877354 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwcdr" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.894020 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.8939997800000001 podStartE2EDuration="1.89399978s" podCreationTimestamp="2025-12-08 20:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:07:55.887161057 +0000 UTC m=+192.038444434" watchObservedRunningTime="2025-12-08 20:07:55.89399978 +0000 UTC m=+192.045283157" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.903121 4781 scope.go:117] "RemoveContainer" containerID="beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.915307 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwcdr"] Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.917970 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwcdr"] Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.931547 4781 scope.go:117] "RemoveContainer" containerID="188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.950445 4781 scope.go:117] "RemoveContainer" containerID="afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a" Dec 08 20:07:55 crc kubenswrapper[4781]: E1208 20:07:55.950751 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a\": container with ID starting with 
afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a not found: ID does not exist" containerID="afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.950786 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a"} err="failed to get container status \"afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a\": rpc error: code = NotFound desc = could not find container \"afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a\": container with ID starting with afc1b421422232193409d48b3890e5c1b539d483edf49f33f7d762b60ac7ba5a not found: ID does not exist" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.950828 4781 scope.go:117] "RemoveContainer" containerID="beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077" Dec 08 20:07:55 crc kubenswrapper[4781]: E1208 20:07:55.951214 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077\": container with ID starting with beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077 not found: ID does not exist" containerID="beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.951283 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077"} err="failed to get container status \"beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077\": rpc error: code = NotFound desc = could not find container \"beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077\": container with ID starting with beb674ab211c50c911b33d7c77b71125e03a98d46c693a21e2df2055fe18f077 not found: ID does not 
exist" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.951304 4781 scope.go:117] "RemoveContainer" containerID="188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3" Dec 08 20:07:55 crc kubenswrapper[4781]: E1208 20:07:55.951646 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3\": container with ID starting with 188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3 not found: ID does not exist" containerID="188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3" Dec 08 20:07:55 crc kubenswrapper[4781]: I1208 20:07:55.951698 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3"} err="failed to get container status \"188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3\": rpc error: code = NotFound desc = could not find container \"188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3\": container with ID starting with 188ff3e4b08db8d576b25da0e5cfcbfd6d16504d456253fc9a72eff9b4ee2ed3 not found: ID does not exist" Dec 08 20:07:56 crc kubenswrapper[4781]: I1208 20:07:56.136134 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" path="/var/lib/kubelet/pods/560ff3a7-c3ed-4a54-80df-e7047ec1af42/volumes" Dec 08 20:07:56 crc kubenswrapper[4781]: I1208 20:07:56.883345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbq9n" event={"ID":"ee128c5f-b1ca-4f73-b1db-d643edb27970","Type":"ContainerStarted","Data":"ee7739dad9f20db62ab4ee24ec58cd22e1d5a6f133142f4b84e2a3512396f7af"} Dec 08 20:07:56 crc kubenswrapper[4781]: I1208 20:07:56.885799 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhcz4" 
event={"ID":"2e09d1ea-0001-4821-9c36-20ec4618fcfc","Type":"ContainerStarted","Data":"119c7df5bb5c06b6be314fb304ceb1399c11c07a5a3195ef881edc0ecb53e792"} Dec 08 20:07:56 crc kubenswrapper[4781]: I1208 20:07:56.888862 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb745" event={"ID":"ca98c5df-cacc-4016-a941-f02ebe37bc01","Type":"ContainerStarted","Data":"d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4"} Dec 08 20:07:57 crc kubenswrapper[4781]: I1208 20:07:57.895652 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerID="ee7739dad9f20db62ab4ee24ec58cd22e1d5a6f133142f4b84e2a3512396f7af" exitCode=0 Dec 08 20:07:57 crc kubenswrapper[4781]: I1208 20:07:57.895740 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbq9n" event={"ID":"ee128c5f-b1ca-4f73-b1db-d643edb27970","Type":"ContainerDied","Data":"ee7739dad9f20db62ab4ee24ec58cd22e1d5a6f133142f4b84e2a3512396f7af"} Dec 08 20:07:57 crc kubenswrapper[4781]: I1208 20:07:57.898891 4781 generic.go:334] "Generic (PLEG): container finished" podID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerID="e6a1c44e81451f19d048840190b553f09570f34d4b2007700faa3a0303471b63" exitCode=0 Dec 08 20:07:57 crc kubenswrapper[4781]: I1208 20:07:57.898951 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2rbv" event={"ID":"a48498fe-4ab1-428b-bf97-d8c7fe2d002a","Type":"ContainerDied","Data":"e6a1c44e81451f19d048840190b553f09570f34d4b2007700faa3a0303471b63"} Dec 08 20:07:57 crc kubenswrapper[4781]: I1208 20:07:57.901158 4781 generic.go:334] "Generic (PLEG): container finished" podID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerID="119c7df5bb5c06b6be314fb304ceb1399c11c07a5a3195ef881edc0ecb53e792" exitCode=0 Dec 08 20:07:57 crc kubenswrapper[4781]: I1208 20:07:57.901202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-lhcz4" event={"ID":"2e09d1ea-0001-4821-9c36-20ec4618fcfc","Type":"ContainerDied","Data":"119c7df5bb5c06b6be314fb304ceb1399c11c07a5a3195ef881edc0ecb53e792"} Dec 08 20:07:57 crc kubenswrapper[4781]: I1208 20:07:57.905104 4781 generic.go:334] "Generic (PLEG): container finished" podID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerID="d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4" exitCode=0 Dec 08 20:07:57 crc kubenswrapper[4781]: I1208 20:07:57.905138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb745" event={"ID":"ca98c5df-cacc-4016-a941-f02ebe37bc01","Type":"ContainerDied","Data":"d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4"} Dec 08 20:07:58 crc kubenswrapper[4781]: I1208 20:07:58.912029 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6j47" event={"ID":"41421b22-7fd3-4c47-a406-1681ecda5d10","Type":"ContainerStarted","Data":"63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c"} Dec 08 20:07:58 crc kubenswrapper[4781]: I1208 20:07:58.915107 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbq9n" event={"ID":"ee128c5f-b1ca-4f73-b1db-d643edb27970","Type":"ContainerStarted","Data":"99f98c15bf28875ae7ed9c178bcaca9e61865d34e8155a76b3e1c6298bd380a0"} Dec 08 20:07:58 crc kubenswrapper[4781]: I1208 20:07:58.916976 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhcz4" event={"ID":"2e09d1ea-0001-4821-9c36-20ec4618fcfc","Type":"ContainerStarted","Data":"1e78d58c609646958ddea68029dd8ab53376d361c0b4f0e5ca17ae0ea62da878"} Dec 08 20:07:58 crc kubenswrapper[4781]: I1208 20:07:58.918883 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2rbv" 
event={"ID":"a48498fe-4ab1-428b-bf97-d8c7fe2d002a","Type":"ContainerStarted","Data":"33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67"} Dec 08 20:07:58 crc kubenswrapper[4781]: I1208 20:07:58.920865 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb745" event={"ID":"ca98c5df-cacc-4016-a941-f02ebe37bc01","Type":"ContainerStarted","Data":"e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1"} Dec 08 20:07:58 crc kubenswrapper[4781]: I1208 20:07:58.948552 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lb745" podStartSLOduration=3.117423822 podStartE2EDuration="46.948534904s" podCreationTimestamp="2025-12-08 20:07:12 +0000 UTC" firstStartedPulling="2025-12-08 20:07:14.465422036 +0000 UTC m=+150.616705413" lastFinishedPulling="2025-12-08 20:07:58.296533118 +0000 UTC m=+194.447816495" observedRunningTime="2025-12-08 20:07:58.94554172 +0000 UTC m=+195.096825097" watchObservedRunningTime="2025-12-08 20:07:58.948534904 +0000 UTC m=+195.099818281" Dec 08 20:07:58 crc kubenswrapper[4781]: I1208 20:07:58.966090 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lhcz4" podStartSLOduration=2.031259714 podStartE2EDuration="46.966072588s" podCreationTimestamp="2025-12-08 20:07:12 +0000 UTC" firstStartedPulling="2025-12-08 20:07:13.401333459 +0000 UTC m=+149.552616836" lastFinishedPulling="2025-12-08 20:07:58.336146333 +0000 UTC m=+194.487429710" observedRunningTime="2025-12-08 20:07:58.96297393 +0000 UTC m=+195.114257337" watchObservedRunningTime="2025-12-08 20:07:58.966072588 +0000 UTC m=+195.117355965" Dec 08 20:07:58 crc kubenswrapper[4781]: I1208 20:07:58.988806 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vbq9n" podStartSLOduration=1.9888691870000001 podStartE2EDuration="47.988790267s" 
podCreationTimestamp="2025-12-08 20:07:11 +0000 UTC" firstStartedPulling="2025-12-08 20:07:12.35459569 +0000 UTC m=+148.505879067" lastFinishedPulling="2025-12-08 20:07:58.35451677 +0000 UTC m=+194.505800147" observedRunningTime="2025-12-08 20:07:58.986312627 +0000 UTC m=+195.137596024" watchObservedRunningTime="2025-12-08 20:07:58.988790267 +0000 UTC m=+195.140073644" Dec 08 20:07:59 crc kubenswrapper[4781]: I1208 20:07:59.007397 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d2rbv" podStartSLOduration=3.971094954 podStartE2EDuration="51.007379531s" podCreationTimestamp="2025-12-08 20:07:08 +0000 UTC" firstStartedPulling="2025-12-08 20:07:11.265119548 +0000 UTC m=+147.416402925" lastFinishedPulling="2025-12-08 20:07:58.301404125 +0000 UTC m=+194.452687502" observedRunningTime="2025-12-08 20:07:59.004993493 +0000 UTC m=+195.156276870" watchObservedRunningTime="2025-12-08 20:07:59.007379531 +0000 UTC m=+195.158662908" Dec 08 20:07:59 crc kubenswrapper[4781]: I1208 20:07:59.300598 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:59 crc kubenswrapper[4781]: I1208 20:07:59.300649 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:07:59 crc kubenswrapper[4781]: I1208 20:07:59.928982 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghptn" event={"ID":"924b676c-8556-4d07-bf4b-a3607f3780d1","Type":"ContainerStarted","Data":"1cf51ed6fd514d78afd6c6f6534e005a1a00e8bf72b53cc2f2fbd76b7384d528"} Dec 08 20:07:59 crc kubenswrapper[4781]: I1208 20:07:59.931860 4781 generic.go:334] "Generic (PLEG): container finished" podID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerID="63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c" exitCode=0 Dec 08 20:07:59 crc 
kubenswrapper[4781]: I1208 20:07:59.931961 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6j47" event={"ID":"41421b22-7fd3-4c47-a406-1681ecda5d10","Type":"ContainerDied","Data":"63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c"} Dec 08 20:07:59 crc kubenswrapper[4781]: I1208 20:07:59.947658 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:07:59 crc kubenswrapper[4781]: I1208 20:07:59.947718 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:08:00 crc kubenswrapper[4781]: I1208 20:08:00.347572 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-d2rbv" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="registry-server" probeResult="failure" output=< Dec 08 20:08:00 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 08 20:08:00 crc kubenswrapper[4781]: > Dec 08 20:08:00 crc kubenswrapper[4781]: I1208 20:08:00.937847 4781 generic.go:334] "Generic (PLEG): container finished" podID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerID="1cf51ed6fd514d78afd6c6f6534e005a1a00e8bf72b53cc2f2fbd76b7384d528" exitCode=0 Dec 08 20:08:00 crc kubenswrapper[4781]: I1208 20:08:00.937943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghptn" 
event={"ID":"924b676c-8556-4d07-bf4b-a3607f3780d1","Type":"ContainerDied","Data":"1cf51ed6fd514d78afd6c6f6534e005a1a00e8bf72b53cc2f2fbd76b7384d528"} Dec 08 20:08:00 crc kubenswrapper[4781]: I1208 20:08:00.940025 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6j47" event={"ID":"41421b22-7fd3-4c47-a406-1681ecda5d10","Type":"ContainerStarted","Data":"3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d"} Dec 08 20:08:00 crc kubenswrapper[4781]: I1208 20:08:00.942175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5g7vv" event={"ID":"aef6b08e-0164-4a0b-800b-c7a61bf2e35d","Type":"ContainerStarted","Data":"c96b789c5bc257d2de11c6d58099a6df97db3a22f405ee38c2d0d684806eaa02"} Dec 08 20:08:00 crc kubenswrapper[4781]: I1208 20:08:00.974425 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t6j47" podStartSLOduration=2.924922783 podStartE2EDuration="51.974408688s" podCreationTimestamp="2025-12-08 20:07:09 +0000 UTC" firstStartedPulling="2025-12-08 20:07:11.259223682 +0000 UTC m=+147.410507059" lastFinishedPulling="2025-12-08 20:08:00.308709597 +0000 UTC m=+196.459992964" observedRunningTime="2025-12-08 20:08:00.972831584 +0000 UTC m=+197.124114961" watchObservedRunningTime="2025-12-08 20:08:00.974408688 +0000 UTC m=+197.125692095" Dec 08 20:08:01 crc kubenswrapper[4781]: I1208 20:08:01.500829 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:08:01 crc kubenswrapper[4781]: I1208 20:08:01.500882 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:08:01 crc kubenswrapper[4781]: I1208 20:08:01.542493 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 
08 20:08:01 crc kubenswrapper[4781]: I1208 20:08:01.956371 4781 generic.go:334] "Generic (PLEG): container finished" podID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerID="c96b789c5bc257d2de11c6d58099a6df97db3a22f405ee38c2d0d684806eaa02" exitCode=0 Dec 08 20:08:01 crc kubenswrapper[4781]: I1208 20:08:01.956744 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5g7vv" event={"ID":"aef6b08e-0164-4a0b-800b-c7a61bf2e35d","Type":"ContainerDied","Data":"c96b789c5bc257d2de11c6d58099a6df97db3a22f405ee38c2d0d684806eaa02"} Dec 08 20:08:01 crc kubenswrapper[4781]: I1208 20:08:01.961108 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghptn" event={"ID":"924b676c-8556-4d07-bf4b-a3607f3780d1","Type":"ContainerStarted","Data":"35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8"} Dec 08 20:08:01 crc kubenswrapper[4781]: I1208 20:08:01.999579 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ghptn" podStartSLOduration=2.934251266 podStartE2EDuration="52.999554769s" podCreationTimestamp="2025-12-08 20:07:09 +0000 UTC" firstStartedPulling="2025-12-08 20:07:11.257207815 +0000 UTC m=+147.408491192" lastFinishedPulling="2025-12-08 20:08:01.322511318 +0000 UTC m=+197.473794695" observedRunningTime="2025-12-08 20:08:01.99672513 +0000 UTC m=+198.148008507" watchObservedRunningTime="2025-12-08 20:08:01.999554769 +0000 UTC m=+198.150838146" Dec 08 20:08:02 crc kubenswrapper[4781]: I1208 20:08:02.469036 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lhcz4" Dec 08 20:08:02 crc kubenswrapper[4781]: I1208 20:08:02.469231 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lhcz4" Dec 08 20:08:02 crc kubenswrapper[4781]: I1208 20:08:02.897040 4781 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lb745" Dec 08 20:08:02 crc kubenswrapper[4781]: I1208 20:08:02.897106 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lb745" Dec 08 20:08:03 crc kubenswrapper[4781]: I1208 20:08:03.529001 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lhcz4" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="registry-server" probeResult="failure" output=< Dec 08 20:08:03 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 08 20:08:03 crc kubenswrapper[4781]: > Dec 08 20:08:03 crc kubenswrapper[4781]: I1208 20:08:03.935354 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lb745" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="registry-server" probeResult="failure" output=< Dec 08 20:08:03 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 08 20:08:03 crc kubenswrapper[4781]: > Dec 08 20:08:05 crc kubenswrapper[4781]: I1208 20:08:05.988933 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5g7vv" event={"ID":"aef6b08e-0164-4a0b-800b-c7a61bf2e35d","Type":"ContainerStarted","Data":"65f3197c322cd1e90772097e537257dd64e8acd8a25dd24425f27e95bbdd241c"} Dec 08 20:08:06 crc kubenswrapper[4781]: I1208 20:08:06.017464 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5g7vv" podStartSLOduration=3.992309174 podStartE2EDuration="57.017445085s" podCreationTimestamp="2025-12-08 20:07:09 +0000 UTC" firstStartedPulling="2025-12-08 20:07:11.270956482 +0000 UTC m=+147.422239859" lastFinishedPulling="2025-12-08 20:08:04.296092393 +0000 UTC m=+200.447375770" observedRunningTime="2025-12-08 20:08:06.013185983 +0000 UTC m=+202.164469360" 
watchObservedRunningTime="2025-12-08 20:08:06.017445085 +0000 UTC m=+202.168728462" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.364464 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.407224 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.515054 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.515097 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.553393 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.742364 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.742419 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.778582 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.993428 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:08:09 crc kubenswrapper[4781]: I1208 20:08:09.993506 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 
20:08:10 crc kubenswrapper[4781]: I1208 20:08:10.039675 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:08:10 crc kubenswrapper[4781]: I1208 20:08:10.054167 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:08:10 crc kubenswrapper[4781]: I1208 20:08:10.056438 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:08:10 crc kubenswrapper[4781]: I1208 20:08:10.092111 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:08:10 crc kubenswrapper[4781]: I1208 20:08:10.539754 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t6j47"] Dec 08 20:08:11 crc kubenswrapper[4781]: I1208 20:08:11.547381 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:08:11 crc kubenswrapper[4781]: I1208 20:08:11.938040 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5g7vv"] Dec 08 20:08:12 crc kubenswrapper[4781]: I1208 20:08:12.017999 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5g7vv" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerName="registry-server" containerID="cri-o://65f3197c322cd1e90772097e537257dd64e8acd8a25dd24425f27e95bbdd241c" gracePeriod=2 Dec 08 20:08:12 crc kubenswrapper[4781]: I1208 20:08:12.018591 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t6j47" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerName="registry-server" 
containerID="cri-o://3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d" gracePeriod=2 Dec 08 20:08:12 crc kubenswrapper[4781]: I1208 20:08:12.503880 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lhcz4" Dec 08 20:08:12 crc kubenswrapper[4781]: I1208 20:08:12.550506 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lhcz4" Dec 08 20:08:12 crc kubenswrapper[4781]: I1208 20:08:12.958677 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lb745" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.002461 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lb745" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.031944 4781 generic.go:334] "Generic (PLEG): container finished" podID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerID="65f3197c322cd1e90772097e537257dd64e8acd8a25dd24425f27e95bbdd241c" exitCode=0 Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.031983 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5g7vv" event={"ID":"aef6b08e-0164-4a0b-800b-c7a61bf2e35d","Type":"ContainerDied","Data":"65f3197c322cd1e90772097e537257dd64e8acd8a25dd24425f27e95bbdd241c"} Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.662600 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.727478 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.861892 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxmmh\" (UniqueName: \"kubernetes.io/projected/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-kube-api-access-zxmmh\") pod \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.861988 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-utilities\") pod \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.862052 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-catalog-content\") pod \"41421b22-7fd3-4c47-a406-1681ecda5d10\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.862083 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-utilities\") pod \"41421b22-7fd3-4c47-a406-1681ecda5d10\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.862108 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qffs\" (UniqueName: \"kubernetes.io/projected/41421b22-7fd3-4c47-a406-1681ecda5d10-kube-api-access-9qffs\") pod \"41421b22-7fd3-4c47-a406-1681ecda5d10\" (UID: \"41421b22-7fd3-4c47-a406-1681ecda5d10\") " Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.862142 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-catalog-content\") pod \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\" (UID: \"aef6b08e-0164-4a0b-800b-c7a61bf2e35d\") " Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.867845 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-kube-api-access-zxmmh" (OuterVolumeSpecName: "kube-api-access-zxmmh") pod "aef6b08e-0164-4a0b-800b-c7a61bf2e35d" (UID: "aef6b08e-0164-4a0b-800b-c7a61bf2e35d"). InnerVolumeSpecName "kube-api-access-zxmmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.868681 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-utilities" (OuterVolumeSpecName: "utilities") pod "aef6b08e-0164-4a0b-800b-c7a61bf2e35d" (UID: "aef6b08e-0164-4a0b-800b-c7a61bf2e35d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.869937 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41421b22-7fd3-4c47-a406-1681ecda5d10-kube-api-access-9qffs" (OuterVolumeSpecName: "kube-api-access-9qffs") pod "41421b22-7fd3-4c47-a406-1681ecda5d10" (UID: "41421b22-7fd3-4c47-a406-1681ecda5d10"). InnerVolumeSpecName "kube-api-access-9qffs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.871353 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-utilities" (OuterVolumeSpecName: "utilities") pod "41421b22-7fd3-4c47-a406-1681ecda5d10" (UID: "41421b22-7fd3-4c47-a406-1681ecda5d10"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.914449 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aef6b08e-0164-4a0b-800b-c7a61bf2e35d" (UID: "aef6b08e-0164-4a0b-800b-c7a61bf2e35d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.933807 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41421b22-7fd3-4c47-a406-1681ecda5d10" (UID: "41421b22-7fd3-4c47-a406-1681ecda5d10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.963330 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.963367 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.963378 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41421b22-7fd3-4c47-a406-1681ecda5d10-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.963387 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qffs\" (UniqueName: \"kubernetes.io/projected/41421b22-7fd3-4c47-a406-1681ecda5d10-kube-api-access-9qffs\") on node \"crc\" 
DevicePath \"\"" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.963396 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:13 crc kubenswrapper[4781]: I1208 20:08:13.963407 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxmmh\" (UniqueName: \"kubernetes.io/projected/aef6b08e-0164-4a0b-800b-c7a61bf2e35d-kube-api-access-zxmmh\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.041413 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5g7vv" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.041414 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5g7vv" event={"ID":"aef6b08e-0164-4a0b-800b-c7a61bf2e35d","Type":"ContainerDied","Data":"2ba37bf137c56b7d571d70e72ed412f5395bc7cd8f53e92e36bc71ee97c73e43"} Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.041484 4781 scope.go:117] "RemoveContainer" containerID="65f3197c322cd1e90772097e537257dd64e8acd8a25dd24425f27e95bbdd241c" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.045336 4781 generic.go:334] "Generic (PLEG): container finished" podID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerID="3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d" exitCode=0 Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.045376 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6j47" event={"ID":"41421b22-7fd3-4c47-a406-1681ecda5d10","Type":"ContainerDied","Data":"3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d"} Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.045404 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-t6j47" event={"ID":"41421b22-7fd3-4c47-a406-1681ecda5d10","Type":"ContainerDied","Data":"9fdba995ec384d162e2b6f04530690e299e93d441dd9469410582ac0415c71c8"} Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.045522 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6j47" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.063547 4781 scope.go:117] "RemoveContainer" containerID="c96b789c5bc257d2de11c6d58099a6df97db3a22f405ee38c2d0d684806eaa02" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.082599 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5g7vv"] Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.084186 4781 scope.go:117] "RemoveContainer" containerID="1ed647ccfe43b9bfa9a3ee7dc2fdc449128e9dafa7e4cfa389dc608644a4bfe9" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.085664 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5g7vv"] Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.094714 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t6j47"] Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.097636 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t6j47"] Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.101080 4781 scope.go:117] "RemoveContainer" containerID="3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.116519 4781 scope.go:117] "RemoveContainer" containerID="63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.129325 4781 scope.go:117] "RemoveContainer" containerID="f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968" Dec 08 
20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.132198 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" path="/var/lib/kubelet/pods/41421b22-7fd3-4c47-a406-1681ecda5d10/volumes" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.132883 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" path="/var/lib/kubelet/pods/aef6b08e-0164-4a0b-800b-c7a61bf2e35d/volumes" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.143467 4781 scope.go:117] "RemoveContainer" containerID="3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d" Dec 08 20:08:14 crc kubenswrapper[4781]: E1208 20:08:14.143946 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d\": container with ID starting with 3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d not found: ID does not exist" containerID="3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.144037 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d"} err="failed to get container status \"3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d\": rpc error: code = NotFound desc = could not find container \"3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d\": container with ID starting with 3bba5697fe28447c98d7bc5c24ab979216ba8c499db68dc10ad0655a2dccdd9d not found: ID does not exist" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.145489 4781 scope.go:117] "RemoveContainer" containerID="63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c" Dec 08 20:08:14 crc kubenswrapper[4781]: E1208 20:08:14.145869 4781 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c\": container with ID starting with 63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c not found: ID does not exist" containerID="63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.145907 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c"} err="failed to get container status \"63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c\": rpc error: code = NotFound desc = could not find container \"63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c\": container with ID starting with 63d3ef64a12811d728e39cc355d52e313e0b5e08731d2942cc63e3a801fbea3c not found: ID does not exist" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.145941 4781 scope.go:117] "RemoveContainer" containerID="f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968" Dec 08 20:08:14 crc kubenswrapper[4781]: E1208 20:08:14.146201 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968\": container with ID starting with f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968 not found: ID does not exist" containerID="f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968" Dec 08 20:08:14 crc kubenswrapper[4781]: I1208 20:08:14.146230 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968"} err="failed to get container status \"f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968\": rpc error: code = NotFound desc = could 
not find container \"f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968\": container with ID starting with f62151b9a37307fbb0e31c11383803e6d5131578e1847382f422c264bb10a968 not found: ID does not exist" Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.336495 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lb745"] Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.338192 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lb745" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="registry-server" containerID="cri-o://e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1" gracePeriod=2 Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.668738 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lb745" Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.800844 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8jgk\" (UniqueName: \"kubernetes.io/projected/ca98c5df-cacc-4016-a941-f02ebe37bc01-kube-api-access-w8jgk\") pod \"ca98c5df-cacc-4016-a941-f02ebe37bc01\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.800951 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-utilities\") pod \"ca98c5df-cacc-4016-a941-f02ebe37bc01\" (UID: \"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.800994 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-catalog-content\") pod \"ca98c5df-cacc-4016-a941-f02ebe37bc01\" (UID: 
\"ca98c5df-cacc-4016-a941-f02ebe37bc01\") " Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.802113 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-utilities" (OuterVolumeSpecName: "utilities") pod "ca98c5df-cacc-4016-a941-f02ebe37bc01" (UID: "ca98c5df-cacc-4016-a941-f02ebe37bc01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.806097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca98c5df-cacc-4016-a941-f02ebe37bc01-kube-api-access-w8jgk" (OuterVolumeSpecName: "kube-api-access-w8jgk") pod "ca98c5df-cacc-4016-a941-f02ebe37bc01" (UID: "ca98c5df-cacc-4016-a941-f02ebe37bc01"). InnerVolumeSpecName "kube-api-access-w8jgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.902648 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8jgk\" (UniqueName: \"kubernetes.io/projected/ca98c5df-cacc-4016-a941-f02ebe37bc01-kube-api-access-w8jgk\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.902977 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:16 crc kubenswrapper[4781]: I1208 20:08:16.936222 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca98c5df-cacc-4016-a941-f02ebe37bc01" (UID: "ca98c5df-cacc-4016-a941-f02ebe37bc01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.004017 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca98c5df-cacc-4016-a941-f02ebe37bc01-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.024426 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" podUID="a0818ea3-b629-47a3-8edb-d77e60a23068" containerName="oauth-openshift" containerID="cri-o://d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35" gracePeriod=15 Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.063784 4781 generic.go:334] "Generic (PLEG): container finished" podID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerID="e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1" exitCode=0 Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.063823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb745" event={"ID":"ca98c5df-cacc-4016-a941-f02ebe37bc01","Type":"ContainerDied","Data":"e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1"} Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.063848 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lb745" event={"ID":"ca98c5df-cacc-4016-a941-f02ebe37bc01","Type":"ContainerDied","Data":"e827a980b2a1d3eab4f6da265ec3e6d22a9b22f3961885d9a5429017f0d5755d"} Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.063864 4781 scope.go:117] "RemoveContainer" containerID="e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.063868 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lb745" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.114024 4781 scope.go:117] "RemoveContainer" containerID="d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.127737 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lb745"] Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.130221 4781 scope.go:117] "RemoveContainer" containerID="9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.130342 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lb745"] Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.159783 4781 scope.go:117] "RemoveContainer" containerID="e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1" Dec 08 20:08:17 crc kubenswrapper[4781]: E1208 20:08:17.160347 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1\": container with ID starting with e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1 not found: ID does not exist" containerID="e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.160380 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1"} err="failed to get container status \"e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1\": rpc error: code = NotFound desc = could not find container \"e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1\": container with ID starting with e677d2575ebae42aec2bab1914b427f15ea3aee29b6585170a6f469f56f777c1 not found: ID does 
not exist" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.160405 4781 scope.go:117] "RemoveContainer" containerID="d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4" Dec 08 20:08:17 crc kubenswrapper[4781]: E1208 20:08:17.160731 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4\": container with ID starting with d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4 not found: ID does not exist" containerID="d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.160754 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4"} err="failed to get container status \"d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4\": rpc error: code = NotFound desc = could not find container \"d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4\": container with ID starting with d24180362ec6f7e2607caaf75133d317886e921fb80a516497374272ababcac4 not found: ID does not exist" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.160770 4781 scope.go:117] "RemoveContainer" containerID="9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6" Dec 08 20:08:17 crc kubenswrapper[4781]: E1208 20:08:17.161221 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6\": container with ID starting with 9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6 not found: ID does not exist" containerID="9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.161250 4781 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6"} err="failed to get container status \"9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6\": rpc error: code = NotFound desc = could not find container \"9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6\": container with ID starting with 9c58816d18c0a186c778bdca85e30960855616475c645145cdd058adc5907af6 not found: ID does not exist" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.317279 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.509859 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-serving-cert\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.509934 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz9px\" (UniqueName: \"kubernetes.io/projected/a0818ea3-b629-47a3-8edb-d77e60a23068-kube-api-access-kz9px\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.509971 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-login\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.509991 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-policies\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510010 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-cliconfig\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-ocp-branding-template\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510067 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-service-ca\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510102 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-idp-0-file-data\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510136 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-session\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510153 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-dir\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510172 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-provider-selection\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510188 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-trusted-ca-bundle\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510204 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-router-certs\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510222 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-error\") pod \"a0818ea3-b629-47a3-8edb-d77e60a23068\" (UID: \"a0818ea3-b629-47a3-8edb-d77e60a23068\") " Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510859 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.510907 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.511188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.511364 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.511594 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.514506 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.514745 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.514936 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0818ea3-b629-47a3-8edb-d77e60a23068-kube-api-access-kz9px" (OuterVolumeSpecName: "kube-api-access-kz9px") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "kube-api-access-kz9px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.514999 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.515093 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.515312 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.515347 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.519116 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.520283 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "a0818ea3-b629-47a3-8edb-d77e60a23068" (UID: "a0818ea3-b629-47a3-8edb-d77e60a23068"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.611663 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz9px\" (UniqueName: \"kubernetes.io/projected/a0818ea3-b629-47a3-8edb-d77e60a23068-kube-api-access-kz9px\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.611697 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612045 4781 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612162 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612178 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612188 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612197 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612208 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612217 4781 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a0818ea3-b629-47a3-8edb-d77e60a23068-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612225 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612234 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612243 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612252 4781 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:17 crc kubenswrapper[4781]: I1208 20:08:17.612260 4781 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0818ea3-b629-47a3-8edb-d77e60a23068-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.075295 4781 generic.go:334] "Generic (PLEG): container finished" podID="a0818ea3-b629-47a3-8edb-d77e60a23068" containerID="d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35" exitCode=0 Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.075353 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" event={"ID":"a0818ea3-b629-47a3-8edb-d77e60a23068","Type":"ContainerDied","Data":"d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35"} Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.075720 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" event={"ID":"a0818ea3-b629-47a3-8edb-d77e60a23068","Type":"ContainerDied","Data":"22862a9ab9d55ac6b7d37ff8b6605b2384b88699d5396ed78868badce08394b8"} Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.075382 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-546nt" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.075974 4781 scope.go:117] "RemoveContainer" containerID="d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.095250 4781 scope.go:117] "RemoveContainer" containerID="d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.095780 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35\": container with ID starting with d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35 not found: ID does not exist" containerID="d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.095885 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35"} err="failed to get container status \"d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35\": rpc error: code = NotFound desc = could not find container \"d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35\": container with ID starting with d4d4e5e6996266ccbb5991ed7666c810e25c3c2e730d544895b9bbe94153cd35 not found: ID does not exist" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.104177 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-546nt"] Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.113414 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-546nt"] Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.133997 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a0818ea3-b629-47a3-8edb-d77e60a23068" path="/var/lib/kubelet/pods/a0818ea3-b629-47a3-8edb-d77e60a23068/volumes" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.134608 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" path="/var/lib/kubelet/pods/ca98c5df-cacc-4016-a941-f02ebe37bc01/volumes" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492117 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6644f974c8-h2rxp"] Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492427 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492447 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492472 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerName="extract-content" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492487 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerName="extract-content" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492510 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerName="extract-utilities" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492527 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerName="extract-utilities" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492547 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerName="extract-content" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 
20:08:18.492588 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerName="extract-content" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492614 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerName="extract-utilities" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492630 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerName="extract-utilities" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492653 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492667 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492689 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerName="extract-utilities" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492702 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerName="extract-utilities" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492717 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492729 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492745 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="extract-utilities" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 
20:08:18.492757 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="extract-utilities" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492774 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerName="extract-content" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492786 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerName="extract-content" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492800 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="extract-content" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492812 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="extract-content" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492829 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492842 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: E1208 20:08:18.492858 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0818ea3-b629-47a3-8edb-d77e60a23068" containerName="oauth-openshift" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.492869 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0818ea3-b629-47a3-8edb-d77e60a23068" containerName="oauth-openshift" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.493056 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="aef6b08e-0164-4a0b-800b-c7a61bf2e35d" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 
20:08:18.493089 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="560ff3a7-c3ed-4a54-80df-e7047ec1af42" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.493119 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0818ea3-b629-47a3-8edb-d77e60a23068" containerName="oauth-openshift" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.493139 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca98c5df-cacc-4016-a941-f02ebe37bc01" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.493158 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="41421b22-7fd3-4c47-a406-1681ecda5d10" containerName="registry-server" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.493722 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.500291 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.500359 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.500091 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.501048 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.501205 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.501218 4781 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.501573 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.504141 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.504159 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.504839 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.505010 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.505011 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.506266 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6644f974c8-h2rxp"] Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.512153 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.515199 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.519960 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623497 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623576 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623600 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 
crc kubenswrapper[4781]: I1208 20:08:18.623623 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w858l\" (UniqueName: \"kubernetes.io/projected/424ee651-36ff-48c0-9e13-996c95a16150-kube-api-access-w858l\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623649 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-router-certs\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623701 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-login\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623729 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-session\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623748 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/424ee651-36ff-48c0-9e13-996c95a16150-audit-dir\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623778 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-audit-policies\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623794 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-service-ca\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-error\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " 
pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.623826 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.725381 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/424ee651-36ff-48c0-9e13-996c95a16150-audit-dir\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.725479 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-audit-policies\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.725501 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-service-ca\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.725509 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/424ee651-36ff-48c0-9e13-996c95a16150-audit-dir\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.726197 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-service-ca\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.725519 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-error\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.726270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.726319 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " 
pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.726341 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.726720 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.726751 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.726771 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w858l\" (UniqueName: \"kubernetes.io/projected/424ee651-36ff-48c0-9e13-996c95a16150-kube-api-access-w858l\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.726798 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.727243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-audit-policies\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.728003 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.728127 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-router-certs\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.730496 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-login\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " 
pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.730657 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-session\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.730523 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-error\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.732251 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-router-certs\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.733181 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.733277 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.733304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.733674 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.734024 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-user-template-login\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.734718 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-session\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " 
pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.738407 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/424ee651-36ff-48c0-9e13-996c95a16150-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.749409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w858l\" (UniqueName: \"kubernetes.io/projected/424ee651-36ff-48c0-9e13-996c95a16150-kube-api-access-w858l\") pod \"oauth-openshift-6644f974c8-h2rxp\" (UID: \"424ee651-36ff-48c0-9e13-996c95a16150\") " pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.822900 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:18 crc kubenswrapper[4781]: I1208 20:08:18.994822 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6644f974c8-h2rxp"] Dec 08 20:08:19 crc kubenswrapper[4781]: W1208 20:08:19.006558 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod424ee651_36ff_48c0_9e13_996c95a16150.slice/crio-e464b8c7afba6620cffcf40555c1a8ac2647b84441d3d48f795b88fa581b2d71 WatchSource:0}: Error finding container e464b8c7afba6620cffcf40555c1a8ac2647b84441d3d48f795b88fa581b2d71: Status 404 returned error can't find the container with id e464b8c7afba6620cffcf40555c1a8ac2647b84441d3d48f795b88fa581b2d71 Dec 08 20:08:19 crc kubenswrapper[4781]: I1208 20:08:19.084091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" event={"ID":"424ee651-36ff-48c0-9e13-996c95a16150","Type":"ContainerStarted","Data":"e464b8c7afba6620cffcf40555c1a8ac2647b84441d3d48f795b88fa581b2d71"} Dec 08 20:08:20 crc kubenswrapper[4781]: I1208 20:08:20.092374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" event={"ID":"424ee651-36ff-48c0-9e13-996c95a16150","Type":"ContainerStarted","Data":"6f828b350ea130f99fe41e8daf5ee53030e2d84b05a30bfa0105f0c99f4eb3a6"} Dec 08 20:08:20 crc kubenswrapper[4781]: I1208 20:08:20.092867 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:20 crc kubenswrapper[4781]: I1208 20:08:20.100536 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" Dec 08 20:08:20 crc kubenswrapper[4781]: I1208 20:08:20.121319 4781 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-authentication/oauth-openshift-6644f974c8-h2rxp" podStartSLOduration=28.121302741 podStartE2EDuration="28.121302741s" podCreationTimestamp="2025-12-08 20:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:08:20.119884861 +0000 UTC m=+216.271168258" watchObservedRunningTime="2025-12-08 20:08:20.121302741 +0000 UTC m=+216.272586128" Dec 08 20:08:29 crc kubenswrapper[4781]: I1208 20:08:29.948574 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:08:29 crc kubenswrapper[4781]: I1208 20:08:29.949233 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:08:29 crc kubenswrapper[4781]: I1208 20:08:29.949470 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:08:29 crc kubenswrapper[4781]: I1208 20:08:29.950188 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:08:29 crc kubenswrapper[4781]: I1208 20:08:29.950257 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01" gracePeriod=600 Dec 08 20:08:30 crc kubenswrapper[4781]: I1208 20:08:30.178157 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01" exitCode=0 Dec 08 20:08:30 crc kubenswrapper[4781]: I1208 20:08:30.178329 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01"} Dec 08 20:08:31 crc kubenswrapper[4781]: I1208 20:08:31.186037 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"35d2e280105962572e3cb8a545c5f1fabd05635f164199f39e55da29ab8b26d5"} Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.482416 4781 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.483298 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d" gracePeriod=15 Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.483343 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
containerID="cri-o://e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884" gracePeriod=15 Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.483378 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2" gracePeriod=15 Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.483368 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34" gracePeriod=15 Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.483375 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01" gracePeriod=15 Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.486292 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.486631 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.486727 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.486810 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.486898 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.487015 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.487096 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.487183 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.487257 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.487333 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.487519 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.487619 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.487694 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.488126 4781 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.488232 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.488315 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.488396 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.488472 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.488562 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.488762 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.488852 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.490430 4781 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.491289 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.495820 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.561154 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.159:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.616069 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.616129 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.616199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.616235 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.616294 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.616393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.616457 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.616489 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.717992 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718061 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718126 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718156 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718135 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718193 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718255 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718269 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718299 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718313 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718343 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718374 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718408 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.718581 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: I1208 20:08:32.862238 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:32 crc kubenswrapper[4781]: E1208 20:08:32.889774 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.159:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f565787906fa8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-08 20:08:32.889057192 +0000 UTC m=+229.040340569,LastTimestamp:2025-12-08 20:08:32.889057192 +0000 UTC m=+229.040340569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.196972 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.198424 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.199106 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884" exitCode=0 Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.199132 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2" exitCode=0 Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.199144 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01" exitCode=0 Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.199154 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34" exitCode=2 Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.199192 4781 scope.go:117] "RemoveContainer" containerID="3478d5d98d6915ccb9d8bb4cbeeaeb0f587c93509ad81a377bd4bd14553102bb" Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.200581 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d"} Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.200623 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0e8671550f00a47d1737202a57e5aa5ba9ba7c70178409ea8028b4520f7e804e"} Dec 08 20:08:33 
crc kubenswrapper[4781]: E1208 20:08:33.201524 4781 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.159:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.203002 4781 generic.go:334] "Generic (PLEG): container finished" podID="00264077-cf52-457f-80f7-0d149562535a" containerID="d1acdf6a7f9fe90ec06f15f6c8f7a2d2ea52271ace199b6ed17a5c6338eb4e77" exitCode=0 Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.203034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"00264077-cf52-457f-80f7-0d149562535a","Type":"ContainerDied","Data":"d1acdf6a7f9fe90ec06f15f6c8f7a2d2ea52271ace199b6ed17a5c6338eb4e77"} Dec 08 20:08:33 crc kubenswrapper[4781]: I1208 20:08:33.203718 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.129311 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.211496 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 08 20:08:34 crc kubenswrapper[4781]: E1208 20:08:34.288149 4781 event.go:368] "Unable to write 
event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.159:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f565787906fa8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-08 20:08:32.889057192 +0000 UTC m=+229.040340569,LastTimestamp:2025-12-08 20:08:32.889057192 +0000 UTC m=+229.040340569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.449471 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.450350 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.638905 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-var-lock\") pod \"00264077-cf52-457f-80f7-0d149562535a\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.639333 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-kubelet-dir\") pod \"00264077-cf52-457f-80f7-0d149562535a\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.639093 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-var-lock" (OuterVolumeSpecName: "var-lock") pod "00264077-cf52-457f-80f7-0d149562535a" (UID: "00264077-cf52-457f-80f7-0d149562535a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.639438 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00264077-cf52-457f-80f7-0d149562535a-kube-api-access\") pod \"00264077-cf52-457f-80f7-0d149562535a\" (UID: \"00264077-cf52-457f-80f7-0d149562535a\") " Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.639519 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "00264077-cf52-457f-80f7-0d149562535a" (UID: "00264077-cf52-457f-80f7-0d149562535a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.639724 4781 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.639745 4781 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/00264077-cf52-457f-80f7-0d149562535a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.645561 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00264077-cf52-457f-80f7-0d149562535a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "00264077-cf52-457f-80f7-0d149562535a" (UID: "00264077-cf52-457f-80f7-0d149562535a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.740629 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/00264077-cf52-457f-80f7-0d149562535a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.864798 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.865569 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.866512 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:34 crc kubenswrapper[4781]: I1208 20:08:34.866996 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.043973 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.044034 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.044063 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.044093 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.044141 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.044212 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.044330 4781 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.044343 4781 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.044351 4781 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.218649 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.218663 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"00264077-cf52-457f-80f7-0d149562535a","Type":"ContainerDied","Data":"9269ff92410a08fc13af558bd0bff1adaf355e1e75002ef7854320c3dc936646"} Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.218706 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9269ff92410a08fc13af558bd0bff1adaf355e1e75002ef7854320c3dc936646" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.221345 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.222057 4781 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d" exitCode=0 Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.222117 4781 scope.go:117] "RemoveContainer" containerID="e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.222138 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.232847 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.233349 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.237485 4781 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.238271 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection 
refused" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.240417 4781 scope.go:117] "RemoveContainer" containerID="b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.257388 4781 scope.go:117] "RemoveContainer" containerID="03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.272863 4781 scope.go:117] "RemoveContainer" containerID="002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.284943 4781 scope.go:117] "RemoveContainer" containerID="06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.298005 4781 scope.go:117] "RemoveContainer" containerID="4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.320824 4781 scope.go:117] "RemoveContainer" containerID="e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884" Dec 08 20:08:35 crc kubenswrapper[4781]: E1208 20:08:35.321315 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\": container with ID starting with e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884 not found: ID does not exist" containerID="e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.321366 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884"} err="failed to get container status \"e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\": rpc error: code = NotFound desc = could not find container 
\"e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884\": container with ID starting with e05002955aaffae3c239ec24524638f388766befbe5d8d225288169af6218884 not found: ID does not exist" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.321405 4781 scope.go:117] "RemoveContainer" containerID="b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2" Dec 08 20:08:35 crc kubenswrapper[4781]: E1208 20:08:35.321780 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\": container with ID starting with b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2 not found: ID does not exist" containerID="b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.321821 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2"} err="failed to get container status \"b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\": rpc error: code = NotFound desc = could not find container \"b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2\": container with ID starting with b46364df22a79939b2f35adc8a0e65941d6aa28c4ff01384710b6861226ed9b2 not found: ID does not exist" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.321850 4781 scope.go:117] "RemoveContainer" containerID="03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01" Dec 08 20:08:35 crc kubenswrapper[4781]: E1208 20:08:35.322470 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\": container with ID starting with 03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01 not found: ID does not exist" 
containerID="03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.322493 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01"} err="failed to get container status \"03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\": rpc error: code = NotFound desc = could not find container \"03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01\": container with ID starting with 03e1355b6665ec53117282e8d4dd31cdfb73f852cdb4286585082fa0ad5cba01 not found: ID does not exist" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.322508 4781 scope.go:117] "RemoveContainer" containerID="002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34" Dec 08 20:08:35 crc kubenswrapper[4781]: E1208 20:08:35.322823 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\": container with ID starting with 002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34 not found: ID does not exist" containerID="002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.322843 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34"} err="failed to get container status \"002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\": rpc error: code = NotFound desc = could not find container \"002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34\": container with ID starting with 002623d9890d60094cda47a828e4d99e9e70ff209ced279e6fe5f92a73482b34 not found: ID does not exist" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.322857 4781 scope.go:117] 
"RemoveContainer" containerID="06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d" Dec 08 20:08:35 crc kubenswrapper[4781]: E1208 20:08:35.323250 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\": container with ID starting with 06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d not found: ID does not exist" containerID="06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.323283 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d"} err="failed to get container status \"06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\": rpc error: code = NotFound desc = could not find container \"06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d\": container with ID starting with 06f49715f76dced1ddc5c50315edf1fce7070ddba02d04ba36bd17388619730d not found: ID does not exist" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.323302 4781 scope.go:117] "RemoveContainer" containerID="4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b" Dec 08 20:08:35 crc kubenswrapper[4781]: E1208 20:08:35.323609 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\": container with ID starting with 4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b not found: ID does not exist" containerID="4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b" Dec 08 20:08:35 crc kubenswrapper[4781]: I1208 20:08:35.323633 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b"} err="failed to get container status \"4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\": rpc error: code = NotFound desc = could not find container \"4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b\": container with ID starting with 4d2136af3365b35164033933592ab939c00037028c28764f2a8541dbc5a8588b not found: ID does not exist" Dec 08 20:08:36 crc kubenswrapper[4781]: I1208 20:08:36.132824 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 08 20:08:37 crc kubenswrapper[4781]: E1208 20:08:37.511653 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:37 crc kubenswrapper[4781]: E1208 20:08:37.512146 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:37 crc kubenswrapper[4781]: E1208 20:08:37.512487 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:37 crc kubenswrapper[4781]: E1208 20:08:37.512842 4781 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:37 crc kubenswrapper[4781]: E1208 20:08:37.513076 4781 controller.go:195] "Failed to update lease" 
err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:37 crc kubenswrapper[4781]: I1208 20:08:37.513111 4781 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 08 20:08:37 crc kubenswrapper[4781]: E1208 20:08:37.513382 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="200ms" Dec 08 20:08:37 crc kubenswrapper[4781]: E1208 20:08:37.714078 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="400ms" Dec 08 20:08:38 crc kubenswrapper[4781]: E1208 20:08:38.115823 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="800ms" Dec 08 20:08:38 crc kubenswrapper[4781]: E1208 20:08:38.916211 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="1.6s" Dec 08 20:08:40 crc kubenswrapper[4781]: E1208 20:08:40.518071 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: 
connection refused" interval="3.2s" Dec 08 20:08:43 crc kubenswrapper[4781]: E1208 20:08:43.719622 4781 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.159:6443: connect: connection refused" interval="6.4s" Dec 08 20:08:44 crc kubenswrapper[4781]: I1208 20:08:44.150075 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:44 crc kubenswrapper[4781]: E1208 20:08:44.288851 4781 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.159:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187f565787906fa8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-08 20:08:32.889057192 +0000 UTC m=+229.040340569,LastTimestamp:2025-12-08 20:08:32.889057192 +0000 UTC m=+229.040340569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 08 20:08:46 crc 
kubenswrapper[4781]: I1208 20:08:46.278639 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 08 20:08:46 crc kubenswrapper[4781]: I1208 20:08:46.279020 4781 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b" exitCode=1 Dec 08 20:08:46 crc kubenswrapper[4781]: I1208 20:08:46.279050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b"} Dec 08 20:08:46 crc kubenswrapper[4781]: I1208 20:08:46.279428 4781 scope.go:117] "RemoveContainer" containerID="fd19630544c82b6b9ef345c71ba87f054b965d3628767c1d01c87d9fa8c51b0b" Dec 08 20:08:46 crc kubenswrapper[4781]: I1208 20:08:46.280076 4781 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:46 crc kubenswrapper[4781]: I1208 20:08:46.280399 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.125553 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.126735 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.127995 4781 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.148522 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.148568 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:47 crc kubenswrapper[4781]: E1208 20:08:47.149045 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.149644 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:47 crc kubenswrapper[4781]: W1208 20:08:47.186426 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f324a32b17807ac26ee6cfa3657d2507fb76c993ed35769b3af53a9b6548c286 WatchSource:0}: Error finding container f324a32b17807ac26ee6cfa3657d2507fb76c993ed35769b3af53a9b6548c286: Status 404 returned error can't find the container with id f324a32b17807ac26ee6cfa3657d2507fb76c993ed35769b3af53a9b6548c286 Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.288237 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f324a32b17807ac26ee6cfa3657d2507fb76c993ed35769b3af53a9b6548c286"} Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.291482 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.291521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"67c1fc9f1c1774f0c6ee9827bb0373c998b4ad22ce88d7e4b84c009ddd4df868"} Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.292339 4781 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:47 crc kubenswrapper[4781]: I1208 20:08:47.292887 
4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.301533 4781 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="edf11d5e7dc3cc6cd75dee8f80fde7949570b50b452bb9a8764559ad80a1c64d" exitCode=0 Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.301611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"edf11d5e7dc3cc6cd75dee8f80fde7949570b50b452bb9a8764559ad80a1c64d"} Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.301956 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.302003 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.302725 4781 status_manager.go:851] "Failed to get status for pod" podUID="00264077-cf52-457f-80f7-0d149562535a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:48 crc kubenswrapper[4781]: E1208 20:08:48.302809 4781 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.303247 4781 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.159:6443: connect: connection refused" Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.940854 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.941238 4781 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 08 20:08:48 crc kubenswrapper[4781]: I1208 20:08:48.941311 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 08 20:08:49 crc kubenswrapper[4781]: I1208 20:08:49.314240 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"18d051837f6133d619ab08956a1a5a657df85737f3efbf7f630633840626f75a"} Dec 08 20:08:49 crc kubenswrapper[4781]: I1208 20:08:49.314288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"26aa15696fe010e9884a8da7f1847b73d1b6a7359e680b1f4c4215540c62e2ba"} Dec 08 20:08:49 crc kubenswrapper[4781]: I1208 20:08:49.314302 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c468acdd2f882981ec98f66c55447e269c92ed254f1adf1714342fdcbce476e"} Dec 08 20:08:49 crc kubenswrapper[4781]: I1208 20:08:49.314312 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9116b45524d16a0217e51642ed2f2b8e4aadb0202b27620a53e3acb144577292"} Dec 08 20:08:50 crc kubenswrapper[4781]: I1208 20:08:50.323824 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fb993c3b7b3a0ba43ceff55806a7fabfa2d53fae5a041b9ee68cdf2b55b97247"} Dec 08 20:08:50 crc kubenswrapper[4781]: I1208 20:08:50.325211 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:50 crc kubenswrapper[4781]: I1208 20:08:50.324855 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:50 crc kubenswrapper[4781]: I1208 20:08:50.325466 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:51 crc kubenswrapper[4781]: I1208 20:08:51.346687 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:08:52 crc kubenswrapper[4781]: I1208 20:08:52.150880 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:52 crc kubenswrapper[4781]: I1208 20:08:52.151216 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:52 crc kubenswrapper[4781]: I1208 20:08:52.157312 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:55 crc kubenswrapper[4781]: I1208 20:08:55.333733 4781 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:55 crc kubenswrapper[4781]: I1208 20:08:55.350960 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:55 crc kubenswrapper[4781]: I1208 20:08:55.350985 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:55 crc kubenswrapper[4781]: I1208 20:08:55.355249 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:08:55 crc kubenswrapper[4781]: I1208 20:08:55.381708 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="91297f46-b9a6-4e47-8f25-e8662a0e312b" Dec 08 20:08:56 crc kubenswrapper[4781]: I1208 20:08:56.355985 4781 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:56 crc kubenswrapper[4781]: I1208 20:08:56.356018 4781 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a8304801-0d0d-4205-855d-777341abe76d" Dec 08 20:08:56 crc kubenswrapper[4781]: I1208 
20:08:56.359149 4781 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="91297f46-b9a6-4e47-8f25-e8662a0e312b" Dec 08 20:08:58 crc kubenswrapper[4781]: I1208 20:08:58.943961 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:08:58 crc kubenswrapper[4781]: I1208 20:08:58.949665 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 08 20:09:04 crc kubenswrapper[4781]: I1208 20:09:04.813395 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 08 20:09:05 crc kubenswrapper[4781]: I1208 20:09:05.766668 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.189346 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.304711 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.363525 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.368852 4781 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.392080 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 08 20:09:06 crc 
kubenswrapper[4781]: I1208 20:09:06.402481 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.414530 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.565320 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.659268 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.745458 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.879821 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.929103 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.954655 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 08 20:09:06 crc kubenswrapper[4781]: I1208 20:09:06.969145 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.058217 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.098847 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"kube-root-ca.crt" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.413369 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.485706 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.548479 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.566116 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.631660 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.688914 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.790102 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.872262 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.984587 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 08 20:09:07 crc kubenswrapper[4781]: I1208 20:09:07.991753 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.007905 4781 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.011181 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.152633 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.326668 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.431562 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.434463 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.606752 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.629301 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.694484 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.825339 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.865372 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.866787 4781 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.964023 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.969539 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 08 20:09:08 crc kubenswrapper[4781]: I1208 20:09:08.977695 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.035953 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.047184 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.167518 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.199345 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.285289 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.302533 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.334552 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 08 
20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.379705 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.418620 4781 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.427802 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.489009 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.577734 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.620041 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.621173 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.764052 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.779571 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.796873 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 08 20:09:09 crc kubenswrapper[4781]: I1208 20:09:09.901393 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.010320 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.011084 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.050285 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.064530 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.075826 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.082708 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.135506 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.175417 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.263290 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.290867 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.291480 
4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.538159 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.547081 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.586740 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.609781 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.648302 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.661208 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.719752 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.778729 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.818027 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.822134 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.873857 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.956529 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.963553 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.963613 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 08 20:09:10 crc kubenswrapper[4781]: I1208 20:09:10.984398 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.003242 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.056651 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.063334 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.098638 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.156887 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.178394 4781 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.180102 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.191621 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.240848 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.251035 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.367353 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.576632 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.603858 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.748909 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.790756 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.812162 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 08 20:09:11 crc 
kubenswrapper[4781]: I1208 20:09:11.921880 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.921893 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 08 20:09:11 crc kubenswrapper[4781]: I1208 20:09:11.985369 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.055654 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.153341 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.191537 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.440475 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.442780 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.593260 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.610773 4781 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.669172 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"openshift-service-ca.crt" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.884231 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.949907 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 08 20:09:12 crc kubenswrapper[4781]: I1208 20:09:12.959690 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.204769 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.308790 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.309177 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.321682 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.423316 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.425852 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.427574 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.432950 
4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.469781 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.514451 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.586579 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.618932 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.654964 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.694428 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.739156 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.792769 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.812502 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.812698 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.855650 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 08 20:09:13 crc kubenswrapper[4781]: I1208 20:09:13.887340 4781 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.107069 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.211059 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.261439 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.292189 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.345466 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.358970 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.369549 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.374600 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.418582 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.427590 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.440785 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.450426 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.518709 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.528576 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.587055 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.651199 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.665438 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.832061 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.957813 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 08 20:09:14 crc kubenswrapper[4781]: I1208 20:09:14.987655 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.080254 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.096413 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.099550 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.153838 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.171911 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.203993 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.268222 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.291472 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.296484 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.359410 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.395667 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.428461 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.444452 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.495470 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.500143 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.515035 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.571495 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.584395 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.615633 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.662441 4781 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.748979 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 08 20:09:15 crc kubenswrapper[4781]: I1208 20:09:15.961742 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.024536 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.047057 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.064215 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.147761 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.176104 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.265045 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.269444 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.330323 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.401854 4781 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.510988 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.532373 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.586879 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.767890 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.816881 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.850694 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.882757 4781 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 08 20:09:16 crc kubenswrapper[4781]: I1208 20:09:16.971363 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.012261 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.103324 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.120977 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.146803 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.325896 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.375533 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.447480 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.497054 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.522345 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.591046 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.593322 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.611566 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.616851 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.639453 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.676584 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 08 20:09:17 crc kubenswrapper[4781]: I1208 20:09:17.708031 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.119372 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.394024 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.412175 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.533618 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.596558 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.698730 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.747513 4781 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.754429 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.850322 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.875935 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 08 20:09:18 crc kubenswrapper[4781]: I1208 20:09:18.930988 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.014185 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.064166 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.129007 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.418341 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.440730 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.501183 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.570374 4781 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.581990 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.664403 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.681823 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.777456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.832396 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 08 20:09:19 crc kubenswrapper[4781]: I1208 20:09:19.990118 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.053983 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.062989 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.102622 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.166656 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 08 
20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.256464 4781 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.260689 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.260764 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.261503 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.264760 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.278400 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.27838051 podStartE2EDuration="25.27838051s" podCreationTimestamp="2025-12-08 20:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:09:20.277284228 +0000 UTC m=+276.428567605" watchObservedRunningTime="2025-12-08 20:09:20.27838051 +0000 UTC m=+276.429663897" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.437015 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.478941 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 08 20:09:20 crc kubenswrapper[4781]: I1208 20:09:20.580678 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 08 20:09:20 crc 
kubenswrapper[4781]: I1208 20:09:20.825427 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 08 20:09:21 crc kubenswrapper[4781]: I1208 20:09:21.310501 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 08 20:09:21 crc kubenswrapper[4781]: I1208 20:09:21.794986 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 08 20:09:22 crc kubenswrapper[4781]: I1208 20:09:22.192410 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 08 20:09:28 crc kubenswrapper[4781]: I1208 20:09:28.803357 4781 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 08 20:09:28 crc kubenswrapper[4781]: I1208 20:09:28.805044 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d" gracePeriod=5 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.129996 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ghptn"] Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.130614 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ghptn" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="registry-server" containerID="cri-o://35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8" gracePeriod=30 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.154550 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-d2rbv"] Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.154971 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d2rbv" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="registry-server" containerID="cri-o://33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67" gracePeriod=30 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.180001 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7lss8"] Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.180334 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" podUID="959c7256-02a6-47bb-9b32-64387b359e95" containerName="marketplace-operator" containerID="cri-o://505d31ce1521e6161c544539096a8dc286d1e53b29759340e78e0a58d236ed05" gracePeriod=30 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.190038 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbq9n"] Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.190450 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vbq9n" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerName="registry-server" containerID="cri-o://99f98c15bf28875ae7ed9c178bcaca9e61865d34e8155a76b3e1c6298bd380a0" gracePeriod=30 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.194169 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhcz4"] Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.194383 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lhcz4" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="registry-server" 
containerID="cri-o://1e78d58c609646958ddea68029dd8ab53376d361c0b4f0e5ca17ae0ea62da878" gracePeriod=30 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.206603 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w8dwv"] Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.206873 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00264077-cf52-457f-80f7-0d149562535a" containerName="installer" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.206895 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="00264077-cf52-457f-80f7-0d149562535a" containerName="installer" Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.206906 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.206914 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.207079 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="00264077-cf52-457f-80f7-0d149562535a" containerName="installer" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.207099 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.207576 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.214486 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w8dwv"] Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.276781 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81d63467-e009-4c3d-8391-9f034f2da751-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: \"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.277073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/81d63467-e009-4c3d-8391-9f034f2da751-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: \"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.277113 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-877lp\" (UniqueName: \"kubernetes.io/projected/81d63467-e009-4c3d-8391-9f034f2da751-kube-api-access-877lp\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: \"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.301393 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67 is running failed: container process not found" 
containerID="33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.301942 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67 is running failed: container process not found" containerID="33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.302578 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67 is running failed: container process not found" containerID="33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.302622 4781 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-d2rbv" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="registry-server" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.378587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81d63467-e009-4c3d-8391-9f034f2da751-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: \"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.378651 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/81d63467-e009-4c3d-8391-9f034f2da751-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: \"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.378692 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-877lp\" (UniqueName: \"kubernetes.io/projected/81d63467-e009-4c3d-8391-9f034f2da751-kube-api-access-877lp\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: \"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.380862 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81d63467-e009-4c3d-8391-9f034f2da751-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: \"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.384164 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/81d63467-e009-4c3d-8391-9f034f2da751-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: \"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.407585 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-877lp\" (UniqueName: \"kubernetes.io/projected/81d63467-e009-4c3d-8391-9f034f2da751-kube-api-access-877lp\") pod \"marketplace-operator-79b997595-w8dwv\" (UID: 
\"81d63467-e009-4c3d-8391-9f034f2da751\") " pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.514619 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8 is running failed: container process not found" containerID="35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.514887 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8 is running failed: container process not found" containerID="35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.515112 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8 is running failed: container process not found" containerID="35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8" cmd=["grpc_health_probe","-addr=:50051"] Dec 08 20:09:29 crc kubenswrapper[4781]: E1208 20:09:29.515135 4781 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-ghptn" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="registry-server" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.537496 4781 
generic.go:334] "Generic (PLEG): container finished" podID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerID="35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8" exitCode=0 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.537615 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghptn" event={"ID":"924b676c-8556-4d07-bf4b-a3607f3780d1","Type":"ContainerDied","Data":"35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8"} Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.544499 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerID="99f98c15bf28875ae7ed9c178bcaca9e61865d34e8155a76b3e1c6298bd380a0" exitCode=0 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.544630 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbq9n" event={"ID":"ee128c5f-b1ca-4f73-b1db-d643edb27970","Type":"ContainerDied","Data":"99f98c15bf28875ae7ed9c178bcaca9e61865d34e8155a76b3e1c6298bd380a0"} Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.546126 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.560508 4781 generic.go:334] "Generic (PLEG): container finished" podID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerID="33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67" exitCode=0 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.560598 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2rbv" event={"ID":"a48498fe-4ab1-428b-bf97-d8c7fe2d002a","Type":"ContainerDied","Data":"33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67"} Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.560659 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d2rbv" event={"ID":"a48498fe-4ab1-428b-bf97-d8c7fe2d002a","Type":"ContainerDied","Data":"d7212972bf7245c6968dff754d383861707879afff40551a0be0f53ac8c38886"} Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.560675 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7212972bf7245c6968dff754d383861707879afff40551a0be0f53ac8c38886" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.564836 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.565484 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.567076 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.567277 4781 generic.go:334] "Generic (PLEG): container finished" podID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerID="1e78d58c609646958ddea68029dd8ab53376d361c0b4f0e5ca17ae0ea62da878" exitCode=0 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.567318 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhcz4" event={"ID":"2e09d1ea-0001-4821-9c36-20ec4618fcfc","Type":"ContainerDied","Data":"1e78d58c609646958ddea68029dd8ab53376d361c0b4f0e5ca17ae0ea62da878"} Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.568559 4781 generic.go:334] "Generic (PLEG): container finished" podID="959c7256-02a6-47bb-9b32-64387b359e95" containerID="505d31ce1521e6161c544539096a8dc286d1e53b29759340e78e0a58d236ed05" exitCode=0 Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.568577 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.568596 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7lss8" event={"ID":"959c7256-02a6-47bb-9b32-64387b359e95","Type":"ContainerDied","Data":"505d31ce1521e6161c544539096a8dc286d1e53b29759340e78e0a58d236ed05"} Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.568634 4781 scope.go:117] "RemoveContainer" containerID="505d31ce1521e6161c544539096a8dc286d1e53b29759340e78e0a58d236ed05" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.568846 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.607414 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhcz4" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682760 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-trusted-ca\") pod \"959c7256-02a6-47bb-9b32-64387b359e95\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682822 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-operator-metrics\") pod \"959c7256-02a6-47bb-9b32-64387b359e95\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682847 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-catalog-content\") pod \"ee128c5f-b1ca-4f73-b1db-d643edb27970\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682864 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ww98\" (UniqueName: \"kubernetes.io/projected/959c7256-02a6-47bb-9b32-64387b359e95-kube-api-access-4ww98\") pod \"959c7256-02a6-47bb-9b32-64387b359e95\" (UID: \"959c7256-02a6-47bb-9b32-64387b359e95\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682885 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-utilities\") pod \"ee128c5f-b1ca-4f73-b1db-d643edb27970\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682907 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-dcgb5\" (UniqueName: \"kubernetes.io/projected/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-kube-api-access-dcgb5\") pod \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682951 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-catalog-content\") pod \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682968 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7snz6\" (UniqueName: \"kubernetes.io/projected/2e09d1ea-0001-4821-9c36-20ec4618fcfc-kube-api-access-7snz6\") pod \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.682997 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-utilities\") pod \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\" (UID: \"a48498fe-4ab1-428b-bf97-d8c7fe2d002a\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.683010 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-utilities\") pod \"924b676c-8556-4d07-bf4b-a3607f3780d1\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.683053 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-catalog-content\") pod \"924b676c-8556-4d07-bf4b-a3607f3780d1\" (UID: 
\"924b676c-8556-4d07-bf4b-a3607f3780d1\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.683082 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-catalog-content\") pod \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.683104 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxlvx\" (UniqueName: \"kubernetes.io/projected/ee128c5f-b1ca-4f73-b1db-d643edb27970-kube-api-access-qxlvx\") pod \"ee128c5f-b1ca-4f73-b1db-d643edb27970\" (UID: \"ee128c5f-b1ca-4f73-b1db-d643edb27970\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.683131 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7bj9\" (UniqueName: \"kubernetes.io/projected/924b676c-8556-4d07-bf4b-a3607f3780d1-kube-api-access-c7bj9\") pod \"924b676c-8556-4d07-bf4b-a3607f3780d1\" (UID: \"924b676c-8556-4d07-bf4b-a3607f3780d1\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.683159 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-utilities\") pod \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\" (UID: \"2e09d1ea-0001-4821-9c36-20ec4618fcfc\") " Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.684536 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "959c7256-02a6-47bb-9b32-64387b359e95" (UID: "959c7256-02a6-47bb-9b32-64387b359e95"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.684553 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-utilities" (OuterVolumeSpecName: "utilities") pod "2e09d1ea-0001-4821-9c36-20ec4618fcfc" (UID: "2e09d1ea-0001-4821-9c36-20ec4618fcfc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.684967 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-utilities" (OuterVolumeSpecName: "utilities") pod "a48498fe-4ab1-428b-bf97-d8c7fe2d002a" (UID: "a48498fe-4ab1-428b-bf97-d8c7fe2d002a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.685113 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-utilities" (OuterVolumeSpecName: "utilities") pod "ee128c5f-b1ca-4f73-b1db-d643edb27970" (UID: "ee128c5f-b1ca-4f73-b1db-d643edb27970"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.685411 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-utilities" (OuterVolumeSpecName: "utilities") pod "924b676c-8556-4d07-bf4b-a3607f3780d1" (UID: "924b676c-8556-4d07-bf4b-a3607f3780d1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.689072 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "959c7256-02a6-47bb-9b32-64387b359e95" (UID: "959c7256-02a6-47bb-9b32-64387b359e95"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.689096 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e09d1ea-0001-4821-9c36-20ec4618fcfc-kube-api-access-7snz6" (OuterVolumeSpecName: "kube-api-access-7snz6") pod "2e09d1ea-0001-4821-9c36-20ec4618fcfc" (UID: "2e09d1ea-0001-4821-9c36-20ec4618fcfc"). InnerVolumeSpecName "kube-api-access-7snz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.689419 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee128c5f-b1ca-4f73-b1db-d643edb27970-kube-api-access-qxlvx" (OuterVolumeSpecName: "kube-api-access-qxlvx") pod "ee128c5f-b1ca-4f73-b1db-d643edb27970" (UID: "ee128c5f-b1ca-4f73-b1db-d643edb27970"). InnerVolumeSpecName "kube-api-access-qxlvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.690024 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/924b676c-8556-4d07-bf4b-a3607f3780d1-kube-api-access-c7bj9" (OuterVolumeSpecName: "kube-api-access-c7bj9") pod "924b676c-8556-4d07-bf4b-a3607f3780d1" (UID: "924b676c-8556-4d07-bf4b-a3607f3780d1"). InnerVolumeSpecName "kube-api-access-c7bj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.693646 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959c7256-02a6-47bb-9b32-64387b359e95-kube-api-access-4ww98" (OuterVolumeSpecName: "kube-api-access-4ww98") pod "959c7256-02a6-47bb-9b32-64387b359e95" (UID: "959c7256-02a6-47bb-9b32-64387b359e95"). InnerVolumeSpecName "kube-api-access-4ww98". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.705287 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-kube-api-access-dcgb5" (OuterVolumeSpecName: "kube-api-access-dcgb5") pod "a48498fe-4ab1-428b-bf97-d8c7fe2d002a" (UID: "a48498fe-4ab1-428b-bf97-d8c7fe2d002a"). InnerVolumeSpecName "kube-api-access-dcgb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.719903 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee128c5f-b1ca-4f73-b1db-d643edb27970" (UID: "ee128c5f-b1ca-4f73-b1db-d643edb27970"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.743313 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "924b676c-8556-4d07-bf4b-a3607f3780d1" (UID: "924b676c-8556-4d07-bf4b-a3607f3780d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.782378 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w8dwv"] Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784576 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxlvx\" (UniqueName: \"kubernetes.io/projected/ee128c5f-b1ca-4f73-b1db-d643edb27970-kube-api-access-qxlvx\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784598 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7bj9\" (UniqueName: \"kubernetes.io/projected/924b676c-8556-4d07-bf4b-a3607f3780d1-kube-api-access-c7bj9\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784610 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784621 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784632 4781 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/959c7256-02a6-47bb-9b32-64387b359e95-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784644 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784654 4781 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ww98\" (UniqueName: \"kubernetes.io/projected/959c7256-02a6-47bb-9b32-64387b359e95-kube-api-access-4ww98\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784664 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee128c5f-b1ca-4f73-b1db-d643edb27970-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784675 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcgb5\" (UniqueName: \"kubernetes.io/projected/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-kube-api-access-dcgb5\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784685 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7snz6\" (UniqueName: \"kubernetes.io/projected/2e09d1ea-0001-4821-9c36-20ec4618fcfc-kube-api-access-7snz6\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784695 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784705 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.784715 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/924b676c-8556-4d07-bf4b-a3607f3780d1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.791661 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a48498fe-4ab1-428b-bf97-d8c7fe2d002a" (UID: "a48498fe-4ab1-428b-bf97-d8c7fe2d002a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.827505 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e09d1ea-0001-4821-9c36-20ec4618fcfc" (UID: "2e09d1ea-0001-4821-9c36-20ec4618fcfc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.885442 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e09d1ea-0001-4821-9c36-20ec4618fcfc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.885467 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a48498fe-4ab1-428b-bf97-d8c7fe2d002a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.899255 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7lss8"] Dec 08 20:09:29 crc kubenswrapper[4781]: I1208 20:09:29.899305 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7lss8"] Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.132329 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959c7256-02a6-47bb-9b32-64387b359e95" path="/var/lib/kubelet/pods/959c7256-02a6-47bb-9b32-64387b359e95/volumes" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.578307 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" event={"ID":"81d63467-e009-4c3d-8391-9f034f2da751","Type":"ContainerStarted","Data":"6e9683b799f8510518f125d26c5ed63924bd982a8c44120db12f10f70480f764"} Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.578707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" event={"ID":"81d63467-e009-4c3d-8391-9f034f2da751","Type":"ContainerStarted","Data":"ef26a962a745b5ff7075292027069858fc815886426b8109f2588f93a69d56a0"} Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.580405 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.585206 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.586447 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vbq9n" event={"ID":"ee128c5f-b1ca-4f73-b1db-d643edb27970","Type":"ContainerDied","Data":"48488c2255eb50992d214196850e915e8071d387b8b632e814afd048374e13d1"} Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.586510 4781 scope.go:117] "RemoveContainer" containerID="99f98c15bf28875ae7ed9c178bcaca9e61865d34e8155a76b3e1c6298bd380a0" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.586694 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vbq9n" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.593272 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhcz4" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.594137 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhcz4" event={"ID":"2e09d1ea-0001-4821-9c36-20ec4618fcfc","Type":"ContainerDied","Data":"92da0ce16ba55207085643d4a1aae953b1ebece6b30b7a23e61228de2191543f"} Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.598158 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-w8dwv" podStartSLOduration=1.598107449 podStartE2EDuration="1.598107449s" podCreationTimestamp="2025-12-08 20:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:09:30.59781607 +0000 UTC m=+286.749099477" watchObservedRunningTime="2025-12-08 20:09:30.598107449 +0000 UTC m=+286.749390896" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.605901 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d2rbv" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.606350 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ghptn" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.607092 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ghptn" event={"ID":"924b676c-8556-4d07-bf4b-a3607f3780d1","Type":"ContainerDied","Data":"2b47b56dca279bb707d175260eb846c6b913fa75bab8ac5530a3ae962ba3df84"} Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.617727 4781 scope.go:117] "RemoveContainer" containerID="ee7739dad9f20db62ab4ee24ec58cd22e1d5a6f133142f4b84e2a3512396f7af" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.621265 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbq9n"] Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.627802 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vbq9n"] Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.639072 4781 scope.go:117] "RemoveContainer" containerID="c6c7a1962abecdf98f55f0b6fa87f1e13732977ccd5de7fc8f3f2f592b2b6a95" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.660887 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhcz4"] Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.665280 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lhcz4"] Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.673727 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ghptn"] Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.676225 4781 scope.go:117] "RemoveContainer" containerID="1e78d58c609646958ddea68029dd8ab53376d361c0b4f0e5ca17ae0ea62da878" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.686907 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ghptn"] Dec 08 20:09:30 crc 
kubenswrapper[4781]: I1208 20:09:30.690113 4781 scope.go:117] "RemoveContainer" containerID="119c7df5bb5c06b6be314fb304ceb1399c11c07a5a3195ef881edc0ecb53e792" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.691449 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d2rbv"] Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.693934 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d2rbv"] Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.703595 4781 scope.go:117] "RemoveContainer" containerID="43250e1ee58029e075f3ad79489740149d5d7ed7de343e5f755b8ae783b352ad" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.719985 4781 scope.go:117] "RemoveContainer" containerID="35afeb83e9568984eaca0bbb6cdfc93d2ba638e677159253be409ecc7ac52ea8" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.736127 4781 scope.go:117] "RemoveContainer" containerID="1cf51ed6fd514d78afd6c6f6534e005a1a00e8bf72b53cc2f2fbd76b7384d528" Dec 08 20:09:30 crc kubenswrapper[4781]: I1208 20:09:30.749959 4781 scope.go:117] "RemoveContainer" containerID="61a4200ee30d459bcaa5024c2254b17dd6fe9238240ee4c60e4e0fef9540a053" Dec 08 20:09:32 crc kubenswrapper[4781]: I1208 20:09:32.132082 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" path="/var/lib/kubelet/pods/2e09d1ea-0001-4821-9c36-20ec4618fcfc/volumes" Dec 08 20:09:32 crc kubenswrapper[4781]: I1208 20:09:32.132656 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" path="/var/lib/kubelet/pods/924b676c-8556-4d07-bf4b-a3607f3780d1/volumes" Dec 08 20:09:32 crc kubenswrapper[4781]: I1208 20:09:32.133230 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" path="/var/lib/kubelet/pods/a48498fe-4ab1-428b-bf97-d8c7fe2d002a/volumes" Dec 08 20:09:32 crc 
kubenswrapper[4781]: I1208 20:09:32.134410 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" path="/var/lib/kubelet/pods/ee128c5f-b1ca-4f73-b1db-d643edb27970/volumes" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.380220 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.380703 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.465717 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.465763 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.465803 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.465833 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: 
"f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.465852 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.465911 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.465953 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.465959 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.466068 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.466249 4781 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.466266 4781 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.466276 4781 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.466284 4781 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.475948 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.567056 4781 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.639544 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.639626 4781 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d" exitCode=137 Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.639702 4781 scope.go:117] "RemoveContainer" containerID="6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.639948 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.656521 4781 scope.go:117] "RemoveContainer" containerID="6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d" Dec 08 20:09:34 crc kubenswrapper[4781]: E1208 20:09:34.657441 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d\": container with ID starting with 6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d not found: ID does not exist" containerID="6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d" Dec 08 20:09:34 crc kubenswrapper[4781]: I1208 20:09:34.657477 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d"} err="failed to get container status \"6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d\": rpc error: code = NotFound desc = could not find container \"6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d\": container with ID starting with 6e509e786bc1e922df0b3fec826a75ca8f79fc237b3145c306bb3e8d6d06757d not found: ID does not exist" Dec 08 20:09:36 crc kubenswrapper[4781]: I1208 20:09:36.131423 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 08 20:09:41 crc kubenswrapper[4781]: I1208 20:09:41.498763 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 08 20:09:51 crc kubenswrapper[4781]: I1208 20:09:51.303469 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 08 20:09:51 crc kubenswrapper[4781]: I1208 
20:09:51.455829 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.392950 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hjncp"] Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.393682 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" podUID="8db57361-77ab-43d3-acc5-d4de29c8f13e" containerName="controller-manager" containerID="cri-o://16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334" gracePeriod=30 Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.488887 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm"] Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.489122 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" podUID="b05dbba1-cec3-4a66-aa31-7362bb50ed2f" containerName="route-controller-manager" containerID="cri-o://412ce4eda377a47b0618b24a4eec4f3295882c420959d4b3e943ca6245b1b99b" gracePeriod=30 Dec 08 20:09:57 crc kubenswrapper[4781]: E1208 20:09:57.495856 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db57361_77ab_43d3_acc5_d4de29c8f13e.slice/crio-conmon-16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8db57361_77ab_43d3_acc5_d4de29c8f13e.slice/crio-16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334.scope\": RecentStats: unable to find data in memory cache]" Dec 08 20:09:57 crc 
kubenswrapper[4781]: I1208 20:09:57.733699 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.759024 4781 generic.go:334] "Generic (PLEG): container finished" podID="b05dbba1-cec3-4a66-aa31-7362bb50ed2f" containerID="412ce4eda377a47b0618b24a4eec4f3295882c420959d4b3e943ca6245b1b99b" exitCode=0 Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.759098 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" event={"ID":"b05dbba1-cec3-4a66-aa31-7362bb50ed2f","Type":"ContainerDied","Data":"412ce4eda377a47b0618b24a4eec4f3295882c420959d4b3e943ca6245b1b99b"} Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.760076 4781 generic.go:334] "Generic (PLEG): container finished" podID="8db57361-77ab-43d3-acc5-d4de29c8f13e" containerID="16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334" exitCode=0 Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.760123 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" event={"ID":"8db57361-77ab-43d3-acc5-d4de29c8f13e","Type":"ContainerDied","Data":"16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334"} Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.760146 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" event={"ID":"8db57361-77ab-43d3-acc5-d4de29c8f13e","Type":"ContainerDied","Data":"946953d16bfbd9399cf82ee863a5c112b7656c5bbd59dd10bad50a1d5a58f78c"} Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.760162 4781 scope.go:117] "RemoveContainer" containerID="16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.760215 4781 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hjncp" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.789737 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.792808 4781 scope.go:117] "RemoveContainer" containerID="16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334" Dec 08 20:09:57 crc kubenswrapper[4781]: E1208 20:09:57.793162 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334\": container with ID starting with 16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334 not found: ID does not exist" containerID="16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.793208 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334"} err="failed to get container status \"16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334\": rpc error: code = NotFound desc = could not find container \"16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334\": container with ID starting with 16ea747a5d3f2f5f79993d9dfef8c011aaf3e67fdeff8bc2402d9a950468f334 not found: ID does not exist" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.869669 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-client-ca\") pod \"8db57361-77ab-43d3-acc5-d4de29c8f13e\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.869750 
4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-config\") pod \"8db57361-77ab-43d3-acc5-d4de29c8f13e\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.869797 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-proxy-ca-bundles\") pod \"8db57361-77ab-43d3-acc5-d4de29c8f13e\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.869852 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv2hr\" (UniqueName: \"kubernetes.io/projected/8db57361-77ab-43d3-acc5-d4de29c8f13e-kube-api-access-hv2hr\") pod \"8db57361-77ab-43d3-acc5-d4de29c8f13e\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.869884 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db57361-77ab-43d3-acc5-d4de29c8f13e-serving-cert\") pod \"8db57361-77ab-43d3-acc5-d4de29c8f13e\" (UID: \"8db57361-77ab-43d3-acc5-d4de29c8f13e\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.870541 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-client-ca" (OuterVolumeSpecName: "client-ca") pod "8db57361-77ab-43d3-acc5-d4de29c8f13e" (UID: "8db57361-77ab-43d3-acc5-d4de29c8f13e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.871237 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8db57361-77ab-43d3-acc5-d4de29c8f13e" (UID: "8db57361-77ab-43d3-acc5-d4de29c8f13e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.871732 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-config" (OuterVolumeSpecName: "config") pod "8db57361-77ab-43d3-acc5-d4de29c8f13e" (UID: "8db57361-77ab-43d3-acc5-d4de29c8f13e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.874993 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db57361-77ab-43d3-acc5-d4de29c8f13e-kube-api-access-hv2hr" (OuterVolumeSpecName: "kube-api-access-hv2hr") pod "8db57361-77ab-43d3-acc5-d4de29c8f13e" (UID: "8db57361-77ab-43d3-acc5-d4de29c8f13e"). InnerVolumeSpecName "kube-api-access-hv2hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.875218 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db57361-77ab-43d3-acc5-d4de29c8f13e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8db57361-77ab-43d3-acc5-d4de29c8f13e" (UID: "8db57361-77ab-43d3-acc5-d4de29c8f13e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.970809 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-serving-cert\") pod \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.970902 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wgqw\" (UniqueName: \"kubernetes.io/projected/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-kube-api-access-6wgqw\") pod \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.970981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-config\") pod \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.971030 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-client-ca\") pod \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\" (UID: \"b05dbba1-cec3-4a66-aa31-7362bb50ed2f\") " Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.971384 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.971409 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:57 crc 
kubenswrapper[4781]: I1208 20:09:57.971426 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8db57361-77ab-43d3-acc5-d4de29c8f13e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.971445 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv2hr\" (UniqueName: \"kubernetes.io/projected/8db57361-77ab-43d3-acc5-d4de29c8f13e-kube-api-access-hv2hr\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.971462 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8db57361-77ab-43d3-acc5-d4de29c8f13e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.972135 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b05dbba1-cec3-4a66-aa31-7362bb50ed2f" (UID: "b05dbba1-cec3-4a66-aa31-7362bb50ed2f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.972866 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-config" (OuterVolumeSpecName: "config") pod "b05dbba1-cec3-4a66-aa31-7362bb50ed2f" (UID: "b05dbba1-cec3-4a66-aa31-7362bb50ed2f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.978314 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-kube-api-access-6wgqw" (OuterVolumeSpecName: "kube-api-access-6wgqw") pod "b05dbba1-cec3-4a66-aa31-7362bb50ed2f" (UID: "b05dbba1-cec3-4a66-aa31-7362bb50ed2f"). InnerVolumeSpecName "kube-api-access-6wgqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:09:57 crc kubenswrapper[4781]: I1208 20:09:57.982316 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b05dbba1-cec3-4a66-aa31-7362bb50ed2f" (UID: "b05dbba1-cec3-4a66-aa31-7362bb50ed2f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.072073 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.072417 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wgqw\" (UniqueName: \"kubernetes.io/projected/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-kube-api-access-6wgqw\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.072434 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.072446 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b05dbba1-cec3-4a66-aa31-7362bb50ed2f-client-ca\") on node \"crc\" DevicePath 
\"\"" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.086855 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hjncp"] Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.090326 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hjncp"] Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.136269 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db57361-77ab-43d3-acc5-d4de29c8f13e" path="/var/lib/kubelet/pods/8db57361-77ab-43d3-acc5-d4de29c8f13e/volumes" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.550775 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74487f4477-klljw"] Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551048 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="extract-utilities" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551064 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="extract-utilities" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551076 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551086 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551100 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerName="extract-utilities" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551108 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" 
containerName="extract-utilities" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551117 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551124 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551136 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959c7256-02a6-47bb-9b32-64387b359e95" containerName="marketplace-operator" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551144 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="959c7256-02a6-47bb-9b32-64387b359e95" containerName="marketplace-operator" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551151 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerName="extract-content" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551158 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerName="extract-content" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551167 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b05dbba1-cec3-4a66-aa31-7362bb50ed2f" containerName="route-controller-manager" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551174 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b05dbba1-cec3-4a66-aa31-7362bb50ed2f" containerName="route-controller-manager" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551184 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="extract-content" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551191 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="extract-content" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551205 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551213 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551226 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="extract-content" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551234 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="extract-content" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551242 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="extract-content" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551249 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="extract-content" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551258 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="extract-utilities" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551267 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="extract-utilities" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551277 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db57361-77ab-43d3-acc5-d4de29c8f13e" containerName="controller-manager" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551284 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8db57361-77ab-43d3-acc5-d4de29c8f13e" containerName="controller-manager" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551296 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="extract-utilities" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551303 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="extract-utilities" Dec 08 20:09:58 crc kubenswrapper[4781]: E1208 20:09:58.551315 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551322 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551431 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="924b676c-8556-4d07-bf4b-a3607f3780d1" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551449 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db57361-77ab-43d3-acc5-d4de29c8f13e" containerName="controller-manager" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551459 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e09d1ea-0001-4821-9c36-20ec4618fcfc" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551469 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b05dbba1-cec3-4a66-aa31-7362bb50ed2f" containerName="route-controller-manager" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551481 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a48498fe-4ab1-428b-bf97-d8c7fe2d002a" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551491 4781 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ee128c5f-b1ca-4f73-b1db-d643edb27970" containerName="registry-server" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.551503 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="959c7256-02a6-47bb-9b32-64387b359e95" containerName="marketplace-operator" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.552118 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.553782 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.554508 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.555195 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd"] Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.555271 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.555715 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.555844 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.556212 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.556985 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.563249 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.563588 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74487f4477-klljw"] Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.567671 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd"] Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.681469 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-config\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.681536 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-client-ca\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 
20:09:58.681567 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-config\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.681595 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-serving-cert\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.681614 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-client-ca\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.681631 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9vmt\" (UniqueName: \"kubernetes.io/projected/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-kube-api-access-j9vmt\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.681658 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg8dw\" (UniqueName: 
\"kubernetes.io/projected/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-kube-api-access-lg8dw\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.681783 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-proxy-ca-bundles\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.681895 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-serving-cert\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.777952 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" event={"ID":"b05dbba1-cec3-4a66-aa31-7362bb50ed2f","Type":"ContainerDied","Data":"bcb503b909f9e11263680a63f238fa462d4e9bae5f8d47318680399a45ea463f"} Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.778026 4781 scope.go:117] "RemoveContainer" containerID="412ce4eda377a47b0618b24a4eec4f3295882c420959d4b3e943ca6245b1b99b" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.778136 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-client-ca\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782738 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-config\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782763 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-serving-cert\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782791 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-client-ca\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782813 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9vmt\" (UniqueName: 
\"kubernetes.io/projected/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-kube-api-access-j9vmt\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782844 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg8dw\" (UniqueName: \"kubernetes.io/projected/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-kube-api-access-lg8dw\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-proxy-ca-bundles\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782899 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-serving-cert\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.782970 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-config\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc 
kubenswrapper[4781]: I1208 20:09:58.783524 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-client-ca\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.784520 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-config\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.784603 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-config\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.785501 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-client-ca\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.786169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-proxy-ca-bundles\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " 
pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.789694 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-serving-cert\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.799856 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9vmt\" (UniqueName: \"kubernetes.io/projected/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-kube-api-access-j9vmt\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.804822 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm"] Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.808090 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg8dw\" (UniqueName: \"kubernetes.io/projected/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-kube-api-access-lg8dw\") pod \"route-controller-manager-5596fc856c-6ttsd\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.809439 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-tqpvm"] Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.810062 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-serving-cert\") pod \"controller-manager-74487f4477-klljw\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.880513 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:58 crc kubenswrapper[4781]: I1208 20:09:58.887958 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.070832 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74487f4477-klljw"] Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.114351 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd"] Dec 08 20:09:59 crc kubenswrapper[4781]: W1208 20:09:59.121127 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dbab6f9_ccca_4641_a17d_38bfde8e65e7.slice/crio-9c6310081cbfb2c146b603b698fa089292c223d4d31e5a5cc0d2b8257708b5a3 WatchSource:0}: Error finding container 9c6310081cbfb2c146b603b698fa089292c223d4d31e5a5cc0d2b8257708b5a3: Status 404 returned error can't find the container with id 9c6310081cbfb2c146b603b698fa089292c223d4d31e5a5cc0d2b8257708b5a3 Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.788774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" event={"ID":"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce","Type":"ContainerStarted","Data":"22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3"} Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 
20:09:59.788819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" event={"ID":"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce","Type":"ContainerStarted","Data":"34f1f704c4f0bf6809f7fe5a860a53cde448d0660217c3a500b760496e0ecd25"} Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.788953 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.789952 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" event={"ID":"3dbab6f9-ccca-4641-a17d-38bfde8e65e7","Type":"ContainerStarted","Data":"fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a"} Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.789983 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" event={"ID":"3dbab6f9-ccca-4641-a17d-38bfde8e65e7","Type":"ContainerStarted","Data":"9c6310081cbfb2c146b603b698fa089292c223d4d31e5a5cc0d2b8257708b5a3"} Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.790517 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.797734 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.799187 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.806328 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-74487f4477-klljw" podStartSLOduration=2.806307129 podStartE2EDuration="2.806307129s" podCreationTimestamp="2025-12-08 20:09:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:09:59.805537817 +0000 UTC m=+315.956821194" watchObservedRunningTime="2025-12-08 20:09:59.806307129 +0000 UTC m=+315.957590506" Dec 08 20:09:59 crc kubenswrapper[4781]: I1208 20:09:59.852147 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" podStartSLOduration=2.852107149 podStartE2EDuration="2.852107149s" podCreationTimestamp="2025-12-08 20:09:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:09:59.828785927 +0000 UTC m=+315.980069304" watchObservedRunningTime="2025-12-08 20:09:59.852107149 +0000 UTC m=+316.003390526" Dec 08 20:10:00 crc kubenswrapper[4781]: I1208 20:10:00.131665 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b05dbba1-cec3-4a66-aa31-7362bb50ed2f" path="/var/lib/kubelet/pods/b05dbba1-cec3-4a66-aa31-7362bb50ed2f/volumes" Dec 08 20:10:05 crc kubenswrapper[4781]: I1208 20:10:05.935944 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74487f4477-klljw"] Dec 08 20:10:05 crc kubenswrapper[4781]: I1208 20:10:05.936160 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" podUID="9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" containerName="controller-manager" containerID="cri-o://22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3" gracePeriod=30 Dec 08 20:10:05 crc kubenswrapper[4781]: I1208 20:10:05.951103 4781 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd"] Dec 08 20:10:05 crc kubenswrapper[4781]: I1208 20:10:05.951330 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" podUID="3dbab6f9-ccca-4641-a17d-38bfde8e65e7" containerName="route-controller-manager" containerID="cri-o://fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a" gracePeriod=30 Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.472494 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.504079 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.583042 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-client-ca\") pod \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.583120 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-config\") pod \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.583172 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg8dw\" (UniqueName: \"kubernetes.io/projected/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-kube-api-access-lg8dw\") pod \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\" (UID: 
\"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.583306 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9vmt\" (UniqueName: \"kubernetes.io/projected/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-kube-api-access-j9vmt\") pod \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.583329 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-proxy-ca-bundles\") pod \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.583347 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-serving-cert\") pod \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\" (UID: \"3dbab6f9-ccca-4641-a17d-38bfde8e65e7\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.584243 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-client-ca" (OuterVolumeSpecName: "client-ca") pod "3dbab6f9-ccca-4641-a17d-38bfde8e65e7" (UID: "3dbab6f9-ccca-4641-a17d-38bfde8e65e7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.584274 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" (UID: "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.584410 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-config" (OuterVolumeSpecName: "config") pod "3dbab6f9-ccca-4641-a17d-38bfde8e65e7" (UID: "3dbab6f9-ccca-4641-a17d-38bfde8e65e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.589123 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3dbab6f9-ccca-4641-a17d-38bfde8e65e7" (UID: "3dbab6f9-ccca-4641-a17d-38bfde8e65e7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.589131 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-kube-api-access-lg8dw" (OuterVolumeSpecName: "kube-api-access-lg8dw") pod "3dbab6f9-ccca-4641-a17d-38bfde8e65e7" (UID: "3dbab6f9-ccca-4641-a17d-38bfde8e65e7"). InnerVolumeSpecName "kube-api-access-lg8dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.590379 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-kube-api-access-j9vmt" (OuterVolumeSpecName: "kube-api-access-j9vmt") pod "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" (UID: "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce"). InnerVolumeSpecName "kube-api-access-j9vmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.684591 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-serving-cert\") pod \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.684645 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-config\") pod \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.684725 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-client-ca\") pod \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\" (UID: \"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce\") " Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.685090 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg8dw\" (UniqueName: \"kubernetes.io/projected/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-kube-api-access-lg8dw\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.685121 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9vmt\" (UniqueName: \"kubernetes.io/projected/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-kube-api-access-j9vmt\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.685176 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.685192 4781 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.685207 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-client-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.685223 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dbab6f9-ccca-4641-a17d-38bfde8e65e7-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.685309 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" (UID: "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.685546 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-config" (OuterVolumeSpecName: "config") pod "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" (UID: "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.688207 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" (UID: "9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.786390 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.786445 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.786463 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce-client-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.832081 4781 generic.go:334] "Generic (PLEG): container finished" podID="9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" containerID="22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3" exitCode=0 Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.832119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" event={"ID":"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce","Type":"ContainerDied","Data":"22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3"} Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.832141 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.832162 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74487f4477-klljw" event={"ID":"9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce","Type":"ContainerDied","Data":"34f1f704c4f0bf6809f7fe5a860a53cde448d0660217c3a500b760496e0ecd25"} Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.832183 4781 scope.go:117] "RemoveContainer" containerID="22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.834840 4781 generic.go:334] "Generic (PLEG): container finished" podID="3dbab6f9-ccca-4641-a17d-38bfde8e65e7" containerID="fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a" exitCode=0 Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.834872 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" event={"ID":"3dbab6f9-ccca-4641-a17d-38bfde8e65e7","Type":"ContainerDied","Data":"fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a"} Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.834892 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" event={"ID":"3dbab6f9-ccca-4641-a17d-38bfde8e65e7","Type":"ContainerDied","Data":"9c6310081cbfb2c146b603b698fa089292c223d4d31e5a5cc0d2b8257708b5a3"} Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.835257 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.853385 4781 scope.go:117] "RemoveContainer" containerID="22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3" Dec 08 20:10:06 crc kubenswrapper[4781]: E1208 20:10:06.854378 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3\": container with ID starting with 22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3 not found: ID does not exist" containerID="22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.854438 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3"} err="failed to get container status \"22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3\": rpc error: code = NotFound desc = could not find container \"22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3\": container with ID starting with 22fd8d37e1f06a0d19d9cf7425a72d5c1a4d28449d72c9318b6dcfee6718c5c3 not found: ID does not exist" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.854474 4781 scope.go:117] "RemoveContainer" containerID="fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.876326 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74487f4477-klljw"] Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.881255 4781 scope.go:117] "RemoveContainer" containerID="fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a" Dec 08 20:10:06 crc kubenswrapper[4781]: E1208 20:10:06.881787 4781 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a\": container with ID starting with fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a not found: ID does not exist" containerID="fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.881851 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a"} err="failed to get container status \"fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a\": rpc error: code = NotFound desc = could not find container \"fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a\": container with ID starting with fd10f094559908b85c5acb2085f1d15d01d588943fd38b6c96413b5ad4dbbc2a not found: ID does not exist" Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.887409 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74487f4477-klljw"] Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.893343 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd"] Dec 08 20:10:06 crc kubenswrapper[4781]: I1208 20:10:06.898520 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5596fc856c-6ttsd"] Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.556167 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94"] Dec 08 20:10:07 crc kubenswrapper[4781]: E1208 20:10:07.557965 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" containerName="controller-manager" Dec 08 20:10:07 crc kubenswrapper[4781]: 
I1208 20:10:07.558072 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" containerName="controller-manager" Dec 08 20:10:07 crc kubenswrapper[4781]: E1208 20:10:07.558179 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbab6f9-ccca-4641-a17d-38bfde8e65e7" containerName="route-controller-manager" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.558265 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbab6f9-ccca-4641-a17d-38bfde8e65e7" containerName="route-controller-manager" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.558452 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" containerName="controller-manager" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.558558 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbab6f9-ccca-4641-a17d-38bfde8e65e7" containerName="route-controller-manager" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.559096 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.559224 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-4nv9l"] Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.560078 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.561848 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.565204 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.565485 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.565566 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.565792 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.565857 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.565794 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.565978 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.565998 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.566050 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 08 
20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.566297 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.566526 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.568720 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-4nv9l"] Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.570485 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.583708 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94"] Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598414 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-config\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598468 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-client-ca\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598512 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-config\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g29rv\" (UniqueName: \"kubernetes.io/projected/482605bb-e881-4901-b2c5-ff076f3de586-kube-api-access-g29rv\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598602 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-proxy-ca-bundles\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482605bb-e881-4901-b2c5-ff076f3de586-serving-cert\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598765 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-client-ca\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " 
pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc53af71-600e-4622-ba7e-d0048ae8c57b-serving-cert\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.598850 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhp7\" (UniqueName: \"kubernetes.io/projected/dc53af71-600e-4622-ba7e-d0048ae8c57b-kube-api-access-kjhp7\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.699405 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-client-ca\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.699483 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-config\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.699507 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g29rv\" (UniqueName: 
\"kubernetes.io/projected/482605bb-e881-4901-b2c5-ff076f3de586-kube-api-access-g29rv\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.700138 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-proxy-ca-bundles\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.700293 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482605bb-e881-4901-b2c5-ff076f3de586-serving-cert\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.700409 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-client-ca\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.700519 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc53af71-600e-4622-ba7e-d0048ae8c57b-serving-cert\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc 
kubenswrapper[4781]: I1208 20:10:07.700628 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhp7\" (UniqueName: \"kubernetes.io/projected/dc53af71-600e-4622-ba7e-d0048ae8c57b-kube-api-access-kjhp7\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.700764 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-config\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.700899 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-client-ca\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.701127 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-proxy-ca-bundles\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.700710 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-config\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " 
pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.701752 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-client-ca\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.702448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-config\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.704715 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc53af71-600e-4622-ba7e-d0048ae8c57b-serving-cert\") pod \"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.706467 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482605bb-e881-4901-b2c5-ff076f3de586-serving-cert\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.718587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhp7\" (UniqueName: \"kubernetes.io/projected/dc53af71-600e-4622-ba7e-d0048ae8c57b-kube-api-access-kjhp7\") pod 
\"controller-manager-6c5c8764-4nv9l\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.719103 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g29rv\" (UniqueName: \"kubernetes.io/projected/482605bb-e881-4901-b2c5-ff076f3de586-kube-api-access-g29rv\") pod \"route-controller-manager-6856fbf746-8rk94\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.880714 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:07 crc kubenswrapper[4781]: I1208 20:10:07.892058 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.137547 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dbab6f9-ccca-4641-a17d-38bfde8e65e7" path="/var/lib/kubelet/pods/3dbab6f9-ccca-4641-a17d-38bfde8e65e7/volumes" Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.138433 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce" path="/var/lib/kubelet/pods/9f29ba5f-a51c-4b4f-aa6e-8b4be1a172ce/volumes" Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.187014 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-4nv9l"] Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.345378 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94"] Dec 08 20:10:08 crc kubenswrapper[4781]: W1208 20:10:08.350798 4781 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482605bb_e881_4901_b2c5_ff076f3de586.slice/crio-c73aa3f69b90315ac9c0e2aea990d19690319375838d2e572beebe88253da7b0 WatchSource:0}: Error finding container c73aa3f69b90315ac9c0e2aea990d19690319375838d2e572beebe88253da7b0: Status 404 returned error can't find the container with id c73aa3f69b90315ac9c0e2aea990d19690319375838d2e572beebe88253da7b0 Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.846605 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" event={"ID":"482605bb-e881-4901-b2c5-ff076f3de586","Type":"ContainerStarted","Data":"e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2"} Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.846649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" event={"ID":"482605bb-e881-4901-b2c5-ff076f3de586","Type":"ContainerStarted","Data":"c73aa3f69b90315ac9c0e2aea990d19690319375838d2e572beebe88253da7b0"} Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.847697 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.848850 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" event={"ID":"dc53af71-600e-4622-ba7e-d0048ae8c57b","Type":"ContainerStarted","Data":"88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41"} Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.848932 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" 
event={"ID":"dc53af71-600e-4622-ba7e-d0048ae8c57b","Type":"ContainerStarted","Data":"650cdf14a2dae479cc9cb8f021235c13b8525a46410edb6788358514c5dce97e"} Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.849169 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.853464 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.890710 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" podStartSLOduration=2.890694293 podStartE2EDuration="2.890694293s" podCreationTimestamp="2025-12-08 20:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:10:08.890516258 +0000 UTC m=+325.041799655" watchObservedRunningTime="2025-12-08 20:10:08.890694293 +0000 UTC m=+325.041977670" Dec 08 20:10:08 crc kubenswrapper[4781]: I1208 20:10:08.891313 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" podStartSLOduration=2.891307821 podStartE2EDuration="2.891307821s" podCreationTimestamp="2025-12-08 20:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:10:08.870737579 +0000 UTC m=+325.022020956" watchObservedRunningTime="2025-12-08 20:10:08.891307821 +0000 UTC m=+325.042591198" Dec 08 20:10:09 crc kubenswrapper[4781]: I1208 20:10:09.177053 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:23 crc 
kubenswrapper[4781]: I1208 20:10:23.969357 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-57nnf"] Dec 08 20:10:23 crc kubenswrapper[4781]: I1208 20:10:23.970823 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:23 crc kubenswrapper[4781]: I1208 20:10:23.972895 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 08 20:10:23 crc kubenswrapper[4781]: I1208 20:10:23.980800 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-57nnf"] Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.087127 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-catalog-content\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.087222 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-utilities\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.087273 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jqk\" (UniqueName: \"kubernetes.io/projected/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-kube-api-access-44jqk\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: 
I1208 20:10:24.159999 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mh8gd"] Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.161156 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.164953 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.169406 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mh8gd"] Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.192930 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-catalog-content\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.192990 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-utilities\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.193027 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jqk\" (UniqueName: \"kubernetes.io/projected/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-kube-api-access-44jqk\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.193507 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-catalog-content\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.193574 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-utilities\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.211705 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jqk\" (UniqueName: \"kubernetes.io/projected/70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81-kube-api-access-44jqk\") pod \"certified-operators-57nnf\" (UID: \"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81\") " pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.288487 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.293864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdb95fae-ff11-445b-afc5-d1b040c9bff9-catalog-content\") pod \"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.294095 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdb95fae-ff11-445b-afc5-d1b040c9bff9-utilities\") pod \"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.294150 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmcqw\" (UniqueName: \"kubernetes.io/projected/cdb95fae-ff11-445b-afc5-d1b040c9bff9-kube-api-access-hmcqw\") pod \"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.396152 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdb95fae-ff11-445b-afc5-d1b040c9bff9-utilities\") pod \"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.396222 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmcqw\" (UniqueName: \"kubernetes.io/projected/cdb95fae-ff11-445b-afc5-d1b040c9bff9-kube-api-access-hmcqw\") pod 
\"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.396299 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdb95fae-ff11-445b-afc5-d1b040c9bff9-catalog-content\") pod \"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.396795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdb95fae-ff11-445b-afc5-d1b040c9bff9-utilities\") pod \"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.396905 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdb95fae-ff11-445b-afc5-d1b040c9bff9-catalog-content\") pod \"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.420234 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmcqw\" (UniqueName: \"kubernetes.io/projected/cdb95fae-ff11-445b-afc5-d1b040c9bff9-kube-api-access-hmcqw\") pod \"community-operators-mh8gd\" (UID: \"cdb95fae-ff11-445b-afc5-d1b040c9bff9\") " pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.482328 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.662207 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-57nnf"] Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.936059 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mh8gd"] Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.936290 4781 generic.go:334] "Generic (PLEG): container finished" podID="70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81" containerID="62b853f00de1cd55d2c7cfbd91dc47d9a023e30fa964aa8d595fd0992366f3e8" exitCode=0 Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.936349 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57nnf" event={"ID":"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81","Type":"ContainerDied","Data":"62b853f00de1cd55d2c7cfbd91dc47d9a023e30fa964aa8d595fd0992366f3e8"} Dec 08 20:10:24 crc kubenswrapper[4781]: I1208 20:10:24.936389 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57nnf" event={"ID":"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81","Type":"ContainerStarted","Data":"57bc572cbfc8d4f5a81be79895e2351bf85103d4840ecba07b71038d57e10f0b"} Dec 08 20:10:24 crc kubenswrapper[4781]: W1208 20:10:24.951716 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdb95fae_ff11_445b_afc5_d1b040c9bff9.slice/crio-8e6e0cb79472091ab0a85184d563204b7cdbc28e3c87b1743a2ef359280a6feb WatchSource:0}: Error finding container 8e6e0cb79472091ab0a85184d563204b7cdbc28e3c87b1743a2ef359280a6feb: Status 404 returned error can't find the container with id 8e6e0cb79472091ab0a85184d563204b7cdbc28e3c87b1743a2ef359280a6feb Dec 08 20:10:25 crc kubenswrapper[4781]: I1208 20:10:25.943707 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-57nnf" event={"ID":"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81","Type":"ContainerStarted","Data":"959990b45373cdcd6fad7f73ba1feacd77b9fe71c3db5f0ca6d0a2a7aaa285d0"} Dec 08 20:10:25 crc kubenswrapper[4781]: I1208 20:10:25.947075 4781 generic.go:334] "Generic (PLEG): container finished" podID="cdb95fae-ff11-445b-afc5-d1b040c9bff9" containerID="64d9d0f555fbff476b936d6310b3cf964821f240983fabde7a0a4c67afcc68bf" exitCode=0 Dec 08 20:10:25 crc kubenswrapper[4781]: I1208 20:10:25.947121 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh8gd" event={"ID":"cdb95fae-ff11-445b-afc5-d1b040c9bff9","Type":"ContainerDied","Data":"64d9d0f555fbff476b936d6310b3cf964821f240983fabde7a0a4c67afcc68bf"} Dec 08 20:10:25 crc kubenswrapper[4781]: I1208 20:10:25.947164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh8gd" event={"ID":"cdb95fae-ff11-445b-afc5-d1b040c9bff9","Type":"ContainerStarted","Data":"8e6e0cb79472091ab0a85184d563204b7cdbc28e3c87b1743a2ef359280a6feb"} Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.370189 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bqh9n"] Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.372836 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.379007 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.397038 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bqh9n"] Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.527209 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db9d24ac-95dc-4952-bef0-ad643de86795-utilities\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.527372 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db9d24ac-95dc-4952-bef0-ad643de86795-catalog-content\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.527446 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfjg9\" (UniqueName: \"kubernetes.io/projected/db9d24ac-95dc-4952-bef0-ad643de86795-kube-api-access-tfjg9\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.561238 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lpj2j"] Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.562254 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.564790 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.576637 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lpj2j"] Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.628478 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db9d24ac-95dc-4952-bef0-ad643de86795-catalog-content\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.628545 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfjg9\" (UniqueName: \"kubernetes.io/projected/db9d24ac-95dc-4952-bef0-ad643de86795-kube-api-access-tfjg9\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.628601 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db9d24ac-95dc-4952-bef0-ad643de86795-utilities\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.628987 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db9d24ac-95dc-4952-bef0-ad643de86795-catalog-content\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" 
Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.629071 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db9d24ac-95dc-4952-bef0-ad643de86795-utilities\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.645063 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfjg9\" (UniqueName: \"kubernetes.io/projected/db9d24ac-95dc-4952-bef0-ad643de86795-kube-api-access-tfjg9\") pod \"redhat-operators-bqh9n\" (UID: \"db9d24ac-95dc-4952-bef0-ad643de86795\") " pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.702983 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.729750 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac429e7-ae4c-4229-9c70-508f5ad917c8-catalog-content\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.729810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac429e7-ae4c-4229-9c70-508f5ad917c8-utilities\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.729878 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ntsq\" (UniqueName: 
\"kubernetes.io/projected/9ac429e7-ae4c-4229-9c70-508f5ad917c8-kube-api-access-9ntsq\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.830997 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac429e7-ae4c-4229-9c70-508f5ad917c8-catalog-content\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.831054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac429e7-ae4c-4229-9c70-508f5ad917c8-utilities\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.831127 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ntsq\" (UniqueName: \"kubernetes.io/projected/9ac429e7-ae4c-4229-9c70-508f5ad917c8-kube-api-access-9ntsq\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.831710 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac429e7-ae4c-4229-9c70-508f5ad917c8-catalog-content\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.832059 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9ac429e7-ae4c-4229-9c70-508f5ad917c8-utilities\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.852987 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ntsq\" (UniqueName: \"kubernetes.io/projected/9ac429e7-ae4c-4229-9c70-508f5ad917c8-kube-api-access-9ntsq\") pod \"redhat-marketplace-lpj2j\" (UID: \"9ac429e7-ae4c-4229-9c70-508f5ad917c8\") " pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.883445 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.958823 4781 generic.go:334] "Generic (PLEG): container finished" podID="70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81" containerID="959990b45373cdcd6fad7f73ba1feacd77b9fe71c3db5f0ca6d0a2a7aaa285d0" exitCode=0 Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.958888 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57nnf" event={"ID":"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81","Type":"ContainerDied","Data":"959990b45373cdcd6fad7f73ba1feacd77b9fe71c3db5f0ca6d0a2a7aaa285d0"} Dec 08 20:10:26 crc kubenswrapper[4781]: I1208 20:10:26.978674 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh8gd" event={"ID":"cdb95fae-ff11-445b-afc5-d1b040c9bff9","Type":"ContainerStarted","Data":"4ae3fa2b8def19482cfcab26c56f6f7644abd2305c023306f11109884be65f5f"} Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.114185 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bqh9n"] Dec 08 20:10:27 crc kubenswrapper[4781]: W1208 20:10:27.118119 4781 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb9d24ac_95dc_4952_bef0_ad643de86795.slice/crio-46b2a4a122a9a2bf09abfafd09b429bcee5723415a9964c61561d9423faf4aea WatchSource:0}: Error finding container 46b2a4a122a9a2bf09abfafd09b429bcee5723415a9964c61561d9423faf4aea: Status 404 returned error can't find the container with id 46b2a4a122a9a2bf09abfafd09b429bcee5723415a9964c61561d9423faf4aea Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.286894 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lpj2j"] Dec 08 20:10:27 crc kubenswrapper[4781]: W1208 20:10:27.291125 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac429e7_ae4c_4229_9c70_508f5ad917c8.slice/crio-feb6c9e42a47fe0a2831577f8ff9065b3d6d4f3e830b18992af562bc23fb1235 WatchSource:0}: Error finding container feb6c9e42a47fe0a2831577f8ff9065b3d6d4f3e830b18992af562bc23fb1235: Status 404 returned error can't find the container with id feb6c9e42a47fe0a2831577f8ff9065b3d6d4f3e830b18992af562bc23fb1235 Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.993817 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-57nnf" event={"ID":"70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81","Type":"ContainerStarted","Data":"5515f77443a8049fb241a6d7d9e210114122bdac88f32044f90e1fe3f78d0cbd"} Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.995670 4781 generic.go:334] "Generic (PLEG): container finished" podID="9ac429e7-ae4c-4229-9c70-508f5ad917c8" containerID="5461158a22cf8b9701a2cf46a45fc03a3210ec366aabc2e24262b5bbffe78f39" exitCode=0 Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.995745 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lpj2j" 
event={"ID":"9ac429e7-ae4c-4229-9c70-508f5ad917c8","Type":"ContainerDied","Data":"5461158a22cf8b9701a2cf46a45fc03a3210ec366aabc2e24262b5bbffe78f39"} Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.995795 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lpj2j" event={"ID":"9ac429e7-ae4c-4229-9c70-508f5ad917c8","Type":"ContainerStarted","Data":"feb6c9e42a47fe0a2831577f8ff9065b3d6d4f3e830b18992af562bc23fb1235"} Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.997570 4781 generic.go:334] "Generic (PLEG): container finished" podID="db9d24ac-95dc-4952-bef0-ad643de86795" containerID="5fca2b8859dd6eec8c947e2a360c5e8db6adea00f27479814a67cbd6e23ca2bf" exitCode=0 Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.997648 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqh9n" event={"ID":"db9d24ac-95dc-4952-bef0-ad643de86795","Type":"ContainerDied","Data":"5fca2b8859dd6eec8c947e2a360c5e8db6adea00f27479814a67cbd6e23ca2bf"} Dec 08 20:10:27 crc kubenswrapper[4781]: I1208 20:10:27.997723 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqh9n" event={"ID":"db9d24ac-95dc-4952-bef0-ad643de86795","Type":"ContainerStarted","Data":"46b2a4a122a9a2bf09abfafd09b429bcee5723415a9964c61561d9423faf4aea"} Dec 08 20:10:28 crc kubenswrapper[4781]: I1208 20:10:28.001450 4781 generic.go:334] "Generic (PLEG): container finished" podID="cdb95fae-ff11-445b-afc5-d1b040c9bff9" containerID="4ae3fa2b8def19482cfcab26c56f6f7644abd2305c023306f11109884be65f5f" exitCode=0 Dec 08 20:10:28 crc kubenswrapper[4781]: I1208 20:10:28.001521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh8gd" event={"ID":"cdb95fae-ff11-445b-afc5-d1b040c9bff9","Type":"ContainerDied","Data":"4ae3fa2b8def19482cfcab26c56f6f7644abd2305c023306f11109884be65f5f"} Dec 08 20:10:28 crc kubenswrapper[4781]: I1208 
20:10:28.012875 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-57nnf" podStartSLOduration=2.590134102 podStartE2EDuration="5.012847488s" podCreationTimestamp="2025-12-08 20:10:23 +0000 UTC" firstStartedPulling="2025-12-08 20:10:24.938245709 +0000 UTC m=+341.089529126" lastFinishedPulling="2025-12-08 20:10:27.360959135 +0000 UTC m=+343.512242512" observedRunningTime="2025-12-08 20:10:28.008465011 +0000 UTC m=+344.159748398" watchObservedRunningTime="2025-12-08 20:10:28.012847488 +0000 UTC m=+344.164130865" Dec 08 20:10:29 crc kubenswrapper[4781]: I1208 20:10:29.009835 4781 generic.go:334] "Generic (PLEG): container finished" podID="9ac429e7-ae4c-4229-9c70-508f5ad917c8" containerID="01bd13ff840aa008045ff5d09dc130a201a0adabe388400938857ba9d246a90c" exitCode=0 Dec 08 20:10:29 crc kubenswrapper[4781]: I1208 20:10:29.009880 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lpj2j" event={"ID":"9ac429e7-ae4c-4229-9c70-508f5ad917c8","Type":"ContainerDied","Data":"01bd13ff840aa008045ff5d09dc130a201a0adabe388400938857ba9d246a90c"} Dec 08 20:10:29 crc kubenswrapper[4781]: I1208 20:10:29.013111 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqh9n" event={"ID":"db9d24ac-95dc-4952-bef0-ad643de86795","Type":"ContainerStarted","Data":"3d7727d4d8e2275756ddbd9025c8b6ffcba04b9ffd7c7321ce2efffc714026c0"} Dec 08 20:10:29 crc kubenswrapper[4781]: I1208 20:10:29.015972 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mh8gd" event={"ID":"cdb95fae-ff11-445b-afc5-d1b040c9bff9","Type":"ContainerStarted","Data":"7855532db49baa876037b88e2d7c3bd3dc9e37138e9638e42edf6c640985e4e4"} Dec 08 20:10:29 crc kubenswrapper[4781]: I1208 20:10:29.048407 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mh8gd" 
podStartSLOduration=2.577911517 podStartE2EDuration="5.048379439s" podCreationTimestamp="2025-12-08 20:10:24 +0000 UTC" firstStartedPulling="2025-12-08 20:10:25.94847854 +0000 UTC m=+342.099761917" lastFinishedPulling="2025-12-08 20:10:28.418946422 +0000 UTC m=+344.570229839" observedRunningTime="2025-12-08 20:10:29.040679137 +0000 UTC m=+345.191962514" watchObservedRunningTime="2025-12-08 20:10:29.048379439 +0000 UTC m=+345.199662856" Dec 08 20:10:30 crc kubenswrapper[4781]: I1208 20:10:30.025061 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lpj2j" event={"ID":"9ac429e7-ae4c-4229-9c70-508f5ad917c8","Type":"ContainerStarted","Data":"6c962d12534c1e0bb2f27d87a162f902084e1884114621df9a6259a59b6196fc"} Dec 08 20:10:30 crc kubenswrapper[4781]: I1208 20:10:30.028882 4781 generic.go:334] "Generic (PLEG): container finished" podID="db9d24ac-95dc-4952-bef0-ad643de86795" containerID="3d7727d4d8e2275756ddbd9025c8b6ffcba04b9ffd7c7321ce2efffc714026c0" exitCode=0 Dec 08 20:10:30 crc kubenswrapper[4781]: I1208 20:10:30.028971 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqh9n" event={"ID":"db9d24ac-95dc-4952-bef0-ad643de86795","Type":"ContainerDied","Data":"3d7727d4d8e2275756ddbd9025c8b6ffcba04b9ffd7c7321ce2efffc714026c0"} Dec 08 20:10:30 crc kubenswrapper[4781]: I1208 20:10:30.045297 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lpj2j" podStartSLOduration=2.658506162 podStartE2EDuration="4.045279675s" podCreationTimestamp="2025-12-08 20:10:26 +0000 UTC" firstStartedPulling="2025-12-08 20:10:27.997972649 +0000 UTC m=+344.149256026" lastFinishedPulling="2025-12-08 20:10:29.384746162 +0000 UTC m=+345.536029539" observedRunningTime="2025-12-08 20:10:30.04336693 +0000 UTC m=+346.194650317" watchObservedRunningTime="2025-12-08 20:10:30.045279675 +0000 UTC m=+346.196563052" Dec 08 20:10:30 crc 
kubenswrapper[4781]: I1208 20:10:30.978563 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4s86h"] Dec 08 20:10:30 crc kubenswrapper[4781]: I1208 20:10:30.979507 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:30 crc kubenswrapper[4781]: I1208 20:10:30.989536 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4s86h"] Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.036167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bqh9n" event={"ID":"db9d24ac-95dc-4952-bef0-ad643de86795","Type":"ContainerStarted","Data":"7563c7fe021089e6a7d734c57336c2d1e12f711cd7ae4b0e826ab24fd0757646"} Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.083906 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86gf8\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-kube-api-access-86gf8\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.083991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-bound-sa-token\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.084015 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/cdd7af85-5615-44a1-91e0-caadce9780a2-registry-certificates\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.084070 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cdd7af85-5615-44a1-91e0-caadce9780a2-trusted-ca\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.084253 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.084317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-registry-tls\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.084345 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cdd7af85-5615-44a1-91e0-caadce9780a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc 
kubenswrapper[4781]: I1208 20:10:31.084372 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cdd7af85-5615-44a1-91e0-caadce9780a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.118701 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.185912 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-bound-sa-token\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.185978 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cdd7af85-5615-44a1-91e0-caadce9780a2-registry-certificates\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.186027 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cdd7af85-5615-44a1-91e0-caadce9780a2-trusted-ca\") pod \"image-registry-66df7c8f76-4s86h\" 
(UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.186092 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-registry-tls\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.186125 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cdd7af85-5615-44a1-91e0-caadce9780a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.186146 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cdd7af85-5615-44a1-91e0-caadce9780a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.186202 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86gf8\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-kube-api-access-86gf8\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.186888 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/cdd7af85-5615-44a1-91e0-caadce9780a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.187078 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cdd7af85-5615-44a1-91e0-caadce9780a2-registry-certificates\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.187526 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cdd7af85-5615-44a1-91e0-caadce9780a2-trusted-ca\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.196516 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-registry-tls\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.196517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cdd7af85-5615-44a1-91e0-caadce9780a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.201717 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-86gf8\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-kube-api-access-86gf8\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.201785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cdd7af85-5615-44a1-91e0-caadce9780a2-bound-sa-token\") pod \"image-registry-66df7c8f76-4s86h\" (UID: \"cdd7af85-5615-44a1-91e0-caadce9780a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.292155 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.701112 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bqh9n" podStartSLOduration=3.244619819 podStartE2EDuration="5.701093828s" podCreationTimestamp="2025-12-08 20:10:26 +0000 UTC" firstStartedPulling="2025-12-08 20:10:27.999283497 +0000 UTC m=+344.150566874" lastFinishedPulling="2025-12-08 20:10:30.455757506 +0000 UTC m=+346.607040883" observedRunningTime="2025-12-08 20:10:31.054258941 +0000 UTC m=+347.205542318" watchObservedRunningTime="2025-12-08 20:10:31.701093828 +0000 UTC m=+347.852377205" Dec 08 20:10:31 crc kubenswrapper[4781]: I1208 20:10:31.702472 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-4s86h"] Dec 08 20:10:31 crc kubenswrapper[4781]: W1208 20:10:31.712188 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdd7af85_5615_44a1_91e0_caadce9780a2.slice/crio-b34e2ca92f9cb28c0131679777fdaf78bca9cc9564eed5b82926ea91645cb673 WatchSource:0}: Error finding container b34e2ca92f9cb28c0131679777fdaf78bca9cc9564eed5b82926ea91645cb673: Status 404 returned error can't find the container with id b34e2ca92f9cb28c0131679777fdaf78bca9cc9564eed5b82926ea91645cb673 Dec 08 20:10:32 crc kubenswrapper[4781]: I1208 20:10:32.042217 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" event={"ID":"cdd7af85-5615-44a1-91e0-caadce9780a2","Type":"ContainerStarted","Data":"b34e2ca92f9cb28c0131679777fdaf78bca9cc9564eed5b82926ea91645cb673"} Dec 08 20:10:33 crc kubenswrapper[4781]: I1208 20:10:33.048819 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" event={"ID":"cdd7af85-5615-44a1-91e0-caadce9780a2","Type":"ContainerStarted","Data":"a14d4e4c1aaed4dc560e86d2e21042abcebae8f9f03577653e15af74067f0df9"} Dec 08 20:10:33 crc kubenswrapper[4781]: I1208 20:10:33.048928 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:33 crc kubenswrapper[4781]: I1208 20:10:33.072975 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" podStartSLOduration=3.072959961 podStartE2EDuration="3.072959961s" podCreationTimestamp="2025-12-08 20:10:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:10:33.068378619 +0000 UTC m=+349.219661996" watchObservedRunningTime="2025-12-08 20:10:33.072959961 +0000 UTC m=+349.224243338" Dec 08 20:10:34 crc kubenswrapper[4781]: I1208 20:10:34.289599 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:34 crc kubenswrapper[4781]: I1208 20:10:34.289937 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:34 crc kubenswrapper[4781]: I1208 20:10:34.326566 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:34 crc kubenswrapper[4781]: I1208 20:10:34.482955 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:34 crc kubenswrapper[4781]: I1208 20:10:34.483259 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:34 crc kubenswrapper[4781]: I1208 20:10:34.537598 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:35 crc kubenswrapper[4781]: I1208 20:10:35.094845 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mh8gd" Dec 08 20:10:35 crc kubenswrapper[4781]: I1208 20:10:35.127844 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-57nnf" Dec 08 20:10:36 crc kubenswrapper[4781]: I1208 20:10:36.703443 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:36 crc kubenswrapper[4781]: I1208 20:10:36.703777 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:36 crc kubenswrapper[4781]: I1208 20:10:36.758545 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:36 crc kubenswrapper[4781]: I1208 
20:10:36.884179 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:36 crc kubenswrapper[4781]: I1208 20:10:36.884226 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:36 crc kubenswrapper[4781]: I1208 20:10:36.943099 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:37 crc kubenswrapper[4781]: I1208 20:10:37.100870 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bqh9n" Dec 08 20:10:37 crc kubenswrapper[4781]: I1208 20:10:37.106070 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lpj2j" Dec 08 20:10:37 crc kubenswrapper[4781]: I1208 20:10:37.411723 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-4nv9l"] Dec 08 20:10:37 crc kubenswrapper[4781]: I1208 20:10:37.412145 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" podUID="dc53af71-600e-4622-ba7e-d0048ae8c57b" containerName="controller-manager" containerID="cri-o://88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41" gracePeriod=30 Dec 08 20:10:37 crc kubenswrapper[4781]: I1208 20:10:37.892645 4781 patch_prober.go:28] interesting pod/controller-manager-6c5c8764-4nv9l container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body= Dec 08 20:10:37 crc kubenswrapper[4781]: I1208 20:10:37.892740 4781 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" podUID="dc53af71-600e-4622-ba7e-d0048ae8c57b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.008640 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.042699 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79c844f6b8-kqjcv"] Dec 08 20:10:39 crc kubenswrapper[4781]: E1208 20:10:39.043103 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc53af71-600e-4622-ba7e-d0048ae8c57b" containerName="controller-manager" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.043127 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc53af71-600e-4622-ba7e-d0048ae8c57b" containerName="controller-manager" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.043350 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc53af71-600e-4622-ba7e-d0048ae8c57b" containerName="controller-manager" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.044675 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.051319 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79c844f6b8-kqjcv"] Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.077840 4781 generic.go:334] "Generic (PLEG): container finished" podID="dc53af71-600e-4622-ba7e-d0048ae8c57b" containerID="88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41" exitCode=0 Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.077893 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" event={"ID":"dc53af71-600e-4622-ba7e-d0048ae8c57b","Type":"ContainerDied","Data":"88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41"} Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.077975 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" event={"ID":"dc53af71-600e-4622-ba7e-d0048ae8c57b","Type":"ContainerDied","Data":"650cdf14a2dae479cc9cb8f021235c13b8525a46410edb6788358514c5dce97e"} Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.077998 4781 scope.go:117] "RemoveContainer" containerID="88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.077930 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c5c8764-4nv9l" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.095246 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjhp7\" (UniqueName: \"kubernetes.io/projected/dc53af71-600e-4622-ba7e-d0048ae8c57b-kube-api-access-kjhp7\") pod \"dc53af71-600e-4622-ba7e-d0048ae8c57b\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.095341 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-proxy-ca-bundles\") pod \"dc53af71-600e-4622-ba7e-d0048ae8c57b\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.095396 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc53af71-600e-4622-ba7e-d0048ae8c57b-serving-cert\") pod \"dc53af71-600e-4622-ba7e-d0048ae8c57b\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.095430 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-config\") pod \"dc53af71-600e-4622-ba7e-d0048ae8c57b\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.095450 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-client-ca\") pod \"dc53af71-600e-4622-ba7e-d0048ae8c57b\" (UID: \"dc53af71-600e-4622-ba7e-d0048ae8c57b\") " Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.096113 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-client-ca" (OuterVolumeSpecName: "client-ca") pod "dc53af71-600e-4622-ba7e-d0048ae8c57b" (UID: "dc53af71-600e-4622-ba7e-d0048ae8c57b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.096330 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.096590 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "dc53af71-600e-4622-ba7e-d0048ae8c57b" (UID: "dc53af71-600e-4622-ba7e-d0048ae8c57b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.097013 4781 scope.go:117] "RemoveContainer" containerID="88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.097221 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-config" (OuterVolumeSpecName: "config") pod "dc53af71-600e-4622-ba7e-d0048ae8c57b" (UID: "dc53af71-600e-4622-ba7e-d0048ae8c57b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:39 crc kubenswrapper[4781]: E1208 20:10:39.097632 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41\": container with ID starting with 88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41 not found: ID does not exist" containerID="88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.097665 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41"} err="failed to get container status \"88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41\": rpc error: code = NotFound desc = could not find container \"88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41\": container with ID starting with 88c6a167865fe7d51770d35223e6dc428b7ce6d1a9d7b28ffa17db2dd63e6b41 not found: ID does not exist" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.099751 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc53af71-600e-4622-ba7e-d0048ae8c57b-kube-api-access-kjhp7" (OuterVolumeSpecName: "kube-api-access-kjhp7") pod "dc53af71-600e-4622-ba7e-d0048ae8c57b" (UID: "dc53af71-600e-4622-ba7e-d0048ae8c57b"). InnerVolumeSpecName "kube-api-access-kjhp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.100105 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc53af71-600e-4622-ba7e-d0048ae8c57b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dc53af71-600e-4622-ba7e-d0048ae8c57b" (UID: "dc53af71-600e-4622-ba7e-d0048ae8c57b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.197786 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-config\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.197845 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46f8v\" (UniqueName: \"kubernetes.io/projected/06991671-87f2-4092-87c1-5172b2c16da6-kube-api-access-46f8v\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.197875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-client-ca\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.197929 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06991671-87f2-4092-87c1-5172b2c16da6-serving-cert\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.198102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-proxy-ca-bundles\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.198331 4781 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.198349 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dc53af71-600e-4622-ba7e-d0048ae8c57b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.198361 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc53af71-600e-4622-ba7e-d0048ae8c57b-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.198375 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjhp7\" (UniqueName: \"kubernetes.io/projected/dc53af71-600e-4622-ba7e-d0048ae8c57b-kube-api-access-kjhp7\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.299599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-config\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.299668 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46f8v\" (UniqueName: \"kubernetes.io/projected/06991671-87f2-4092-87c1-5172b2c16da6-kube-api-access-46f8v\") pod 
\"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.299703 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-client-ca\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.299763 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06991671-87f2-4092-87c1-5172b2c16da6-serving-cert\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.299782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-proxy-ca-bundles\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.300726 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-client-ca\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.301031 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-config\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.302080 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/06991671-87f2-4092-87c1-5172b2c16da6-proxy-ca-bundles\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.304532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06991671-87f2-4092-87c1-5172b2c16da6-serving-cert\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.315080 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46f8v\" (UniqueName: \"kubernetes.io/projected/06991671-87f2-4092-87c1-5172b2c16da6-kube-api-access-46f8v\") pod \"controller-manager-79c844f6b8-kqjcv\" (UID: \"06991671-87f2-4092-87c1-5172b2c16da6\") " pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.362832 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.414654 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-4nv9l"] Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.419397 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c5c8764-4nv9l"] Dec 08 20:10:39 crc kubenswrapper[4781]: I1208 20:10:39.778428 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79c844f6b8-kqjcv"] Dec 08 20:10:40 crc kubenswrapper[4781]: I1208 20:10:40.085792 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" event={"ID":"06991671-87f2-4092-87c1-5172b2c16da6","Type":"ContainerStarted","Data":"7d4a262bff12d15ff545388a8f4ce6cb053b6be57eefb9006deb2e405bbd76e7"} Dec 08 20:10:40 crc kubenswrapper[4781]: I1208 20:10:40.086131 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" event={"ID":"06991671-87f2-4092-87c1-5172b2c16da6","Type":"ContainerStarted","Data":"ea826e9aaaa6270dbd33170e48ba84d678750fe5388aca4e852783031462a261"} Dec 08 20:10:40 crc kubenswrapper[4781]: I1208 20:10:40.086160 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:40 crc kubenswrapper[4781]: I1208 20:10:40.093358 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" Dec 08 20:10:40 crc kubenswrapper[4781]: I1208 20:10:40.109286 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79c844f6b8-kqjcv" podStartSLOduration=3.109267188 
podStartE2EDuration="3.109267188s" podCreationTimestamp="2025-12-08 20:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:10:40.103532893 +0000 UTC m=+356.254816270" watchObservedRunningTime="2025-12-08 20:10:40.109267188 +0000 UTC m=+356.260550565" Dec 08 20:10:40 crc kubenswrapper[4781]: I1208 20:10:40.137495 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc53af71-600e-4622-ba7e-d0048ae8c57b" path="/var/lib/kubelet/pods/dc53af71-600e-4622-ba7e-d0048ae8c57b/volumes" Dec 08 20:10:51 crc kubenswrapper[4781]: I1208 20:10:51.297383 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-4s86h" Dec 08 20:10:51 crc kubenswrapper[4781]: I1208 20:10:51.351021 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dh9vd"] Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.389445 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94"] Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.390030 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" podUID="482605bb-e881-4901-b2c5-ff076f3de586" containerName="route-controller-manager" containerID="cri-o://e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2" gracePeriod=30 Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.824294 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.966552 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g29rv\" (UniqueName: \"kubernetes.io/projected/482605bb-e881-4901-b2c5-ff076f3de586-kube-api-access-g29rv\") pod \"482605bb-e881-4901-b2c5-ff076f3de586\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.966688 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-client-ca\") pod \"482605bb-e881-4901-b2c5-ff076f3de586\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.966712 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-config\") pod \"482605bb-e881-4901-b2c5-ff076f3de586\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.966742 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482605bb-e881-4901-b2c5-ff076f3de586-serving-cert\") pod \"482605bb-e881-4901-b2c5-ff076f3de586\" (UID: \"482605bb-e881-4901-b2c5-ff076f3de586\") " Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.967668 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-config" (OuterVolumeSpecName: "config") pod "482605bb-e881-4901-b2c5-ff076f3de586" (UID: "482605bb-e881-4901-b2c5-ff076f3de586"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.968009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-client-ca" (OuterVolumeSpecName: "client-ca") pod "482605bb-e881-4901-b2c5-ff076f3de586" (UID: "482605bb-e881-4901-b2c5-ff076f3de586"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.971957 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/482605bb-e881-4901-b2c5-ff076f3de586-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "482605bb-e881-4901-b2c5-ff076f3de586" (UID: "482605bb-e881-4901-b2c5-ff076f3de586"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:10:57 crc kubenswrapper[4781]: I1208 20:10:57.972159 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/482605bb-e881-4901-b2c5-ff076f3de586-kube-api-access-g29rv" (OuterVolumeSpecName: "kube-api-access-g29rv") pod "482605bb-e881-4901-b2c5-ff076f3de586" (UID: "482605bb-e881-4901-b2c5-ff076f3de586"). InnerVolumeSpecName "kube-api-access-g29rv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.067996 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g29rv\" (UniqueName: \"kubernetes.io/projected/482605bb-e881-4901-b2c5-ff076f3de586-kube-api-access-g29rv\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.068039 4781 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-client-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.068057 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/482605bb-e881-4901-b2c5-ff076f3de586-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.068066 4781 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/482605bb-e881-4901-b2c5-ff076f3de586-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.196022 4781 generic.go:334] "Generic (PLEG): container finished" podID="482605bb-e881-4901-b2c5-ff076f3de586" containerID="e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2" exitCode=0 Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.196104 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" event={"ID":"482605bb-e881-4901-b2c5-ff076f3de586","Type":"ContainerDied","Data":"e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2"} Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.196155 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" 
event={"ID":"482605bb-e881-4901-b2c5-ff076f3de586","Type":"ContainerDied","Data":"c73aa3f69b90315ac9c0e2aea990d19690319375838d2e572beebe88253da7b0"} Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.196194 4781 scope.go:117] "RemoveContainer" containerID="e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.196407 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.232781 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94"] Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.232852 4781 scope.go:117] "RemoveContainer" containerID="e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2" Dec 08 20:10:58 crc kubenswrapper[4781]: E1208 20:10:58.233366 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2\": container with ID starting with e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2 not found: ID does not exist" containerID="e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.233421 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2"} err="failed to get container status \"e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2\": rpc error: code = NotFound desc = could not find container \"e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2\": container with ID starting with e7f271ac07ec1cc2178270ee8eb7080b47c327028e157cb5a8f71c4c0dfda1b2 not found: ID does not exist" Dec 
08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.238750 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6856fbf746-8rk94"] Dec 08 20:10:58 crc kubenswrapper[4781]: E1208 20:10:58.268598 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482605bb_e881_4901_b2c5_ff076f3de586.slice/crio-c73aa3f69b90315ac9c0e2aea990d19690319375838d2e572beebe88253da7b0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482605bb_e881_4901_b2c5_ff076f3de586.slice\": RecentStats: unable to find data in memory cache]" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.592240 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb"] Dec 08 20:10:58 crc kubenswrapper[4781]: E1208 20:10:58.592501 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="482605bb-e881-4901-b2c5-ff076f3de586" containerName="route-controller-manager" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.592514 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="482605bb-e881-4901-b2c5-ff076f3de586" containerName="route-controller-manager" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.592604 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="482605bb-e881-4901-b2c5-ff076f3de586" containerName="route-controller-manager" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.593019 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.594762 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.594942 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.595075 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.595268 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.595376 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.595485 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.607657 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb"] Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.780037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4lq7\" (UniqueName: \"kubernetes.io/projected/b90141e3-da67-4fcc-8704-6c4cee15eafb-kube-api-access-z4lq7\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.780126 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b90141e3-da67-4fcc-8704-6c4cee15eafb-client-ca\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.780184 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b90141e3-da67-4fcc-8704-6c4cee15eafb-serving-cert\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.780230 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b90141e3-da67-4fcc-8704-6c4cee15eafb-config\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.882254 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b90141e3-da67-4fcc-8704-6c4cee15eafb-client-ca\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.882630 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b90141e3-da67-4fcc-8704-6c4cee15eafb-serving-cert\") pod 
\"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.882669 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b90141e3-da67-4fcc-8704-6c4cee15eafb-config\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.882740 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4lq7\" (UniqueName: \"kubernetes.io/projected/b90141e3-da67-4fcc-8704-6c4cee15eafb-kube-api-access-z4lq7\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.883221 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b90141e3-da67-4fcc-8704-6c4cee15eafb-client-ca\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.883857 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b90141e3-da67-4fcc-8704-6c4cee15eafb-config\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.886646 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b90141e3-da67-4fcc-8704-6c4cee15eafb-serving-cert\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.901866 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4lq7\" (UniqueName: \"kubernetes.io/projected/b90141e3-da67-4fcc-8704-6c4cee15eafb-kube-api-access-z4lq7\") pod \"route-controller-manager-769d54f96d-8d5cb\" (UID: \"b90141e3-da67-4fcc-8704-6c4cee15eafb\") " pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:58 crc kubenswrapper[4781]: I1208 20:10:58.910413 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:10:59 crc kubenswrapper[4781]: I1208 20:10:59.296382 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb"] Dec 08 20:10:59 crc kubenswrapper[4781]: W1208 20:10:59.301660 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90141e3_da67_4fcc_8704_6c4cee15eafb.slice/crio-114feadb01099180bce3dfc7f42b0806e126379d1db319e1d14a6c7eb721f4e8 WatchSource:0}: Error finding container 114feadb01099180bce3dfc7f42b0806e126379d1db319e1d14a6c7eb721f4e8: Status 404 returned error can't find the container with id 114feadb01099180bce3dfc7f42b0806e126379d1db319e1d14a6c7eb721f4e8 Dec 08 20:10:59 crc kubenswrapper[4781]: I1208 20:10:59.948380 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:10:59 crc kubenswrapper[4781]: I1208 20:10:59.948847 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:11:00 crc kubenswrapper[4781]: I1208 20:11:00.134121 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482605bb-e881-4901-b2c5-ff076f3de586" path="/var/lib/kubelet/pods/482605bb-e881-4901-b2c5-ff076f3de586/volumes" Dec 08 20:11:00 crc kubenswrapper[4781]: I1208 20:11:00.209468 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" event={"ID":"b90141e3-da67-4fcc-8704-6c4cee15eafb","Type":"ContainerStarted","Data":"8759909c752e37efe3a64e6992623cbbaa904539d499a32c4bb4823ec503d19f"} Dec 08 20:11:00 crc kubenswrapper[4781]: I1208 20:11:00.209532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" event={"ID":"b90141e3-da67-4fcc-8704-6c4cee15eafb","Type":"ContainerStarted","Data":"114feadb01099180bce3dfc7f42b0806e126379d1db319e1d14a6c7eb721f4e8"} Dec 08 20:11:00 crc kubenswrapper[4781]: I1208 20:11:00.209769 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:11:00 crc kubenswrapper[4781]: I1208 20:11:00.215939 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" Dec 08 20:11:00 crc kubenswrapper[4781]: I1208 20:11:00.232567 4781 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-769d54f96d-8d5cb" podStartSLOduration=3.232535768 podStartE2EDuration="3.232535768s" podCreationTimestamp="2025-12-08 20:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:11:00.22949348 +0000 UTC m=+376.380776937" watchObservedRunningTime="2025-12-08 20:11:00.232535768 +0000 UTC m=+376.383819185" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.394317 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" podUID="bc1243b0-6d24-4282-a3e7-c1c87296ca09" containerName="registry" containerID="cri-o://df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d" gracePeriod=30 Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.764390 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.917564 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1243b0-6d24-4282-a3e7-c1c87296ca09-ca-trust-extracted\") pod \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.917626 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1243b0-6d24-4282-a3e7-c1c87296ca09-installation-pull-secrets\") pod \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.917753 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.917778 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-trusted-ca\") pod \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.917848 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-bound-sa-token\") pod \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.917875 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-tls\") pod \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.917891 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-certificates\") pod \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.917941 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvp5w\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-kube-api-access-mvp5w\") pod \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\" (UID: \"bc1243b0-6d24-4282-a3e7-c1c87296ca09\") " Dec 08 20:11:16 crc 
kubenswrapper[4781]: I1208 20:11:16.918622 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bc1243b0-6d24-4282-a3e7-c1c87296ca09" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.918674 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bc1243b0-6d24-4282-a3e7-c1c87296ca09" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.924218 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-kube-api-access-mvp5w" (OuterVolumeSpecName: "kube-api-access-mvp5w") pod "bc1243b0-6d24-4282-a3e7-c1c87296ca09" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09"). InnerVolumeSpecName "kube-api-access-mvp5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.925832 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc1243b0-6d24-4282-a3e7-c1c87296ca09-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bc1243b0-6d24-4282-a3e7-c1c87296ca09" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.926463 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bc1243b0-6d24-4282-a3e7-c1c87296ca09" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.931449 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bc1243b0-6d24-4282-a3e7-c1c87296ca09" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.932297 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "bc1243b0-6d24-4282-a3e7-c1c87296ca09" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 08 20:11:16 crc kubenswrapper[4781]: I1208 20:11:16.935654 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc1243b0-6d24-4282-a3e7-c1c87296ca09-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bc1243b0-6d24-4282-a3e7-c1c87296ca09" (UID: "bc1243b0-6d24-4282-a3e7-c1c87296ca09"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.019622 4781 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.019651 4781 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.019661 4781 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.019671 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvp5w\" (UniqueName: \"kubernetes.io/projected/bc1243b0-6d24-4282-a3e7-c1c87296ca09-kube-api-access-mvp5w\") on node \"crc\" DevicePath \"\"" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.019681 4781 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bc1243b0-6d24-4282-a3e7-c1c87296ca09-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.019689 4781 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bc1243b0-6d24-4282-a3e7-c1c87296ca09-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.019703 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bc1243b0-6d24-4282-a3e7-c1c87296ca09-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:11:17 crc 
kubenswrapper[4781]: I1208 20:11:17.307130 4781 generic.go:334] "Generic (PLEG): container finished" podID="bc1243b0-6d24-4282-a3e7-c1c87296ca09" containerID="df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d" exitCode=0 Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.307212 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" event={"ID":"bc1243b0-6d24-4282-a3e7-c1c87296ca09","Type":"ContainerDied","Data":"df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d"} Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.307258 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.307293 4781 scope.go:117] "RemoveContainer" containerID="df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.307272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dh9vd" event={"ID":"bc1243b0-6d24-4282-a3e7-c1c87296ca09","Type":"ContainerDied","Data":"3cadc4f31b505832a820aca930b9771faa7b6a69781ea9d86946865b7c239fc7"} Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.334350 4781 scope.go:117] "RemoveContainer" containerID="df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d" Dec 08 20:11:17 crc kubenswrapper[4781]: E1208 20:11:17.334819 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d\": container with ID starting with df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d not found: ID does not exist" containerID="df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.334862 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d"} err="failed to get container status \"df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d\": rpc error: code = NotFound desc = could not find container \"df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d\": container with ID starting with df58bed99f0dfb333830e9813ce600d86742c49e8ed3142b9c1974ed5e17666d not found: ID does not exist" Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.365315 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dh9vd"] Dec 08 20:11:17 crc kubenswrapper[4781]: I1208 20:11:17.369941 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dh9vd"] Dec 08 20:11:18 crc kubenswrapper[4781]: I1208 20:11:18.137690 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1243b0-6d24-4282-a3e7-c1c87296ca09" path="/var/lib/kubelet/pods/bc1243b0-6d24-4282-a3e7-c1c87296ca09/volumes" Dec 08 20:11:29 crc kubenswrapper[4781]: I1208 20:11:29.948706 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:11:29 crc kubenswrapper[4781]: I1208 20:11:29.949147 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:11:59 crc kubenswrapper[4781]: I1208 20:11:59.948546 4781 patch_prober.go:28] interesting 
pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:11:59 crc kubenswrapper[4781]: I1208 20:11:59.949086 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:11:59 crc kubenswrapper[4781]: I1208 20:11:59.949179 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:11:59 crc kubenswrapper[4781]: I1208 20:11:59.949792 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35d2e280105962572e3cb8a545c5f1fabd05635f164199f39e55da29ab8b26d5"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:11:59 crc kubenswrapper[4781]: I1208 20:11:59.949855 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://35d2e280105962572e3cb8a545c5f1fabd05635f164199f39e55da29ab8b26d5" gracePeriod=600 Dec 08 20:12:00 crc kubenswrapper[4781]: I1208 20:12:00.548982 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="35d2e280105962572e3cb8a545c5f1fabd05635f164199f39e55da29ab8b26d5" exitCode=0 Dec 08 20:12:00 crc kubenswrapper[4781]: I1208 20:12:00.549031 
4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"35d2e280105962572e3cb8a545c5f1fabd05635f164199f39e55da29ab8b26d5"} Dec 08 20:12:00 crc kubenswrapper[4781]: I1208 20:12:00.549358 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"dfd2ead44684e651bc28c73f98c7b6fe01726a0e85f06c1edad8116b86010483"} Dec 08 20:12:00 crc kubenswrapper[4781]: I1208 20:12:00.549397 4781 scope.go:117] "RemoveContainer" containerID="5f5c1dbeb25e3d3f3b653aa1e243d7d9f0e67f780a1fd11bb3197981d45b3c01" Dec 08 20:13:44 crc kubenswrapper[4781]: I1208 20:13:44.326058 4781 scope.go:117] "RemoveContainer" containerID="42303b2e7bbea21dc48d531eba20b0aa2f68c084eeac93bbd228ef93031637b9" Dec 08 20:14:29 crc kubenswrapper[4781]: I1208 20:14:29.948142 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:14:29 crc kubenswrapper[4781]: I1208 20:14:29.948571 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:14:44 crc kubenswrapper[4781]: I1208 20:14:44.372102 4781 scope.go:117] "RemoveContainer" containerID="e6a1c44e81451f19d048840190b553f09570f34d4b2007700faa3a0303471b63" Dec 08 20:14:44 crc kubenswrapper[4781]: I1208 20:14:44.388945 4781 scope.go:117] "RemoveContainer" 
containerID="33adac9c2cbef41f0756bc8f865026136409054b83fadd53ebea6a6315d4fc67" Dec 08 20:14:59 crc kubenswrapper[4781]: I1208 20:14:59.948315 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:14:59 crc kubenswrapper[4781]: I1208 20:14:59.949036 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.176866 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs"] Dec 08 20:15:00 crc kubenswrapper[4781]: E1208 20:15:00.177198 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1243b0-6d24-4282-a3e7-c1c87296ca09" containerName="registry" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.177221 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1243b0-6d24-4282-a3e7-c1c87296ca09" containerName="registry" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.177386 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1243b0-6d24-4282-a3e7-c1c87296ca09" containerName="registry" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.177767 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.181879 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.181900 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.182434 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs"] Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.327518 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjtch\" (UniqueName: \"kubernetes.io/projected/d088452a-eba7-4ada-a1d1-312f3471960b-kube-api-access-xjtch\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.327687 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d088452a-eba7-4ada-a1d1-312f3471960b-secret-volume\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.327815 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d088452a-eba7-4ada-a1d1-312f3471960b-config-volume\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.428349 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjtch\" (UniqueName: \"kubernetes.io/projected/d088452a-eba7-4ada-a1d1-312f3471960b-kube-api-access-xjtch\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.428675 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d088452a-eba7-4ada-a1d1-312f3471960b-secret-volume\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.428867 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d088452a-eba7-4ada-a1d1-312f3471960b-config-volume\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.429814 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d088452a-eba7-4ada-a1d1-312f3471960b-config-volume\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.436186 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d088452a-eba7-4ada-a1d1-312f3471960b-secret-volume\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.446270 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjtch\" (UniqueName: \"kubernetes.io/projected/d088452a-eba7-4ada-a1d1-312f3471960b-kube-api-access-xjtch\") pod \"collect-profiles-29420415-np6zs\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.495066 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.666355 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs"] Dec 08 20:15:00 crc kubenswrapper[4781]: I1208 20:15:00.753048 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" event={"ID":"d088452a-eba7-4ada-a1d1-312f3471960b","Type":"ContainerStarted","Data":"6e7b876bafd4dfd3289590feb277609550ebd0172bae9640fe58e9bd2691b975"} Dec 08 20:15:01 crc kubenswrapper[4781]: I1208 20:15:01.759040 4781 generic.go:334] "Generic (PLEG): container finished" podID="d088452a-eba7-4ada-a1d1-312f3471960b" containerID="024f33453b3f33f566d3d78ff09833f46fe509a1c6ee98970c86f0340f4ee7b0" exitCode=0 Dec 08 20:15:01 crc kubenswrapper[4781]: I1208 20:15:01.759120 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" 
event={"ID":"d088452a-eba7-4ada-a1d1-312f3471960b","Type":"ContainerDied","Data":"024f33453b3f33f566d3d78ff09833f46fe509a1c6ee98970c86f0340f4ee7b0"} Dec 08 20:15:02 crc kubenswrapper[4781]: I1208 20:15:02.959662 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.157850 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d088452a-eba7-4ada-a1d1-312f3471960b-secret-volume\") pod \"d088452a-eba7-4ada-a1d1-312f3471960b\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.157984 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d088452a-eba7-4ada-a1d1-312f3471960b-config-volume\") pod \"d088452a-eba7-4ada-a1d1-312f3471960b\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.158111 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjtch\" (UniqueName: \"kubernetes.io/projected/d088452a-eba7-4ada-a1d1-312f3471960b-kube-api-access-xjtch\") pod \"d088452a-eba7-4ada-a1d1-312f3471960b\" (UID: \"d088452a-eba7-4ada-a1d1-312f3471960b\") " Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.159388 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d088452a-eba7-4ada-a1d1-312f3471960b-config-volume" (OuterVolumeSpecName: "config-volume") pod "d088452a-eba7-4ada-a1d1-312f3471960b" (UID: "d088452a-eba7-4ada-a1d1-312f3471960b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.170643 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d088452a-eba7-4ada-a1d1-312f3471960b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d088452a-eba7-4ada-a1d1-312f3471960b" (UID: "d088452a-eba7-4ada-a1d1-312f3471960b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.170833 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d088452a-eba7-4ada-a1d1-312f3471960b-kube-api-access-xjtch" (OuterVolumeSpecName: "kube-api-access-xjtch") pod "d088452a-eba7-4ada-a1d1-312f3471960b" (UID: "d088452a-eba7-4ada-a1d1-312f3471960b"). InnerVolumeSpecName "kube-api-access-xjtch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.259492 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d088452a-eba7-4ada-a1d1-312f3471960b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.259529 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d088452a-eba7-4ada-a1d1-312f3471960b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.259540 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjtch\" (UniqueName: \"kubernetes.io/projected/d088452a-eba7-4ada-a1d1-312f3471960b-kube-api-access-xjtch\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.772304 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" 
event={"ID":"d088452a-eba7-4ada-a1d1-312f3471960b","Type":"ContainerDied","Data":"6e7b876bafd4dfd3289590feb277609550ebd0172bae9640fe58e9bd2691b975"} Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.772388 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e7b876bafd4dfd3289590feb277609550ebd0172bae9640fe58e9bd2691b975" Dec 08 20:15:03 crc kubenswrapper[4781]: I1208 20:15:03.772407 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs" Dec 08 20:15:29 crc kubenswrapper[4781]: I1208 20:15:29.948428 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:15:29 crc kubenswrapper[4781]: I1208 20:15:29.949090 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:15:29 crc kubenswrapper[4781]: I1208 20:15:29.949136 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:15:29 crc kubenswrapper[4781]: I1208 20:15:29.949684 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfd2ead44684e651bc28c73f98c7b6fe01726a0e85f06c1edad8116b86010483"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:15:29 crc 
kubenswrapper[4781]: I1208 20:15:29.949740 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://dfd2ead44684e651bc28c73f98c7b6fe01726a0e85f06c1edad8116b86010483" gracePeriod=600 Dec 08 20:15:30 crc kubenswrapper[4781]: I1208 20:15:30.919380 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="dfd2ead44684e651bc28c73f98c7b6fe01726a0e85f06c1edad8116b86010483" exitCode=0 Dec 08 20:15:30 crc kubenswrapper[4781]: I1208 20:15:30.919427 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"dfd2ead44684e651bc28c73f98c7b6fe01726a0e85f06c1edad8116b86010483"} Dec 08 20:15:30 crc kubenswrapper[4781]: I1208 20:15:30.919712 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"771a2f271b567bb174dbc7c73044708946a8faab42d018d48f6894347bef1ff3"} Dec 08 20:15:30 crc kubenswrapper[4781]: I1208 20:15:30.919739 4781 scope.go:117] "RemoveContainer" containerID="35d2e280105962572e3cb8a545c5f1fabd05635f164199f39e55da29ab8b26d5" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.089903 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-t8kb7"] Dec 08 20:15:35 crc kubenswrapper[4781]: E1208 20:15:35.090443 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d088452a-eba7-4ada-a1d1-312f3471960b" containerName="collect-profiles" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.090454 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d088452a-eba7-4ada-a1d1-312f3471960b" containerName="collect-profiles" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.090558 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d088452a-eba7-4ada-a1d1-312f3471960b" containerName="collect-profiles" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.090988 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-t8kb7" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.095994 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.096127 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-gmpzq" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.096405 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.099603 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-t8kb7"] Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.110505 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nfw2s"] Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.111293 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nfw2s" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.113716 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6wwzz" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.134980 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nfw2s"] Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.140156 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cpdhc"] Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.141011 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.143046 4781 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-bc5dt" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.152645 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cpdhc"] Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.155868 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddzkc\" (UniqueName: \"kubernetes.io/projected/b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d-kube-api-access-ddzkc\") pod \"cert-manager-5b446d88c5-nfw2s\" (UID: \"b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d\") " pod="cert-manager/cert-manager-5b446d88c5-nfw2s" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.156116 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxcvr\" (UniqueName: \"kubernetes.io/projected/a8d7f2da-3db7-4daf-afd5-4d3984932e2d-kube-api-access-xxcvr\") pod \"cert-manager-webhook-5655c58dd6-cpdhc\" (UID: \"a8d7f2da-3db7-4daf-afd5-4d3984932e2d\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.156257 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzbb8\" (UniqueName: \"kubernetes.io/projected/579553a7-31d3-4b32-98fd-03e631e208d4-kube-api-access-pzbb8\") pod \"cert-manager-cainjector-7f985d654d-t8kb7\" (UID: \"579553a7-31d3-4b32-98fd-03e631e208d4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-t8kb7" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.257047 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxcvr\" (UniqueName: \"kubernetes.io/projected/a8d7f2da-3db7-4daf-afd5-4d3984932e2d-kube-api-access-xxcvr\") pod \"cert-manager-webhook-5655c58dd6-cpdhc\" (UID: \"a8d7f2da-3db7-4daf-afd5-4d3984932e2d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.257116 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzbb8\" (UniqueName: \"kubernetes.io/projected/579553a7-31d3-4b32-98fd-03e631e208d4-kube-api-access-pzbb8\") pod \"cert-manager-cainjector-7f985d654d-t8kb7\" (UID: \"579553a7-31d3-4b32-98fd-03e631e208d4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-t8kb7" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.257161 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddzkc\" (UniqueName: \"kubernetes.io/projected/b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d-kube-api-access-ddzkc\") pod \"cert-manager-5b446d88c5-nfw2s\" (UID: \"b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d\") " pod="cert-manager/cert-manager-5b446d88c5-nfw2s" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.276572 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxcvr\" (UniqueName: 
\"kubernetes.io/projected/a8d7f2da-3db7-4daf-afd5-4d3984932e2d-kube-api-access-xxcvr\") pod \"cert-manager-webhook-5655c58dd6-cpdhc\" (UID: \"a8d7f2da-3db7-4daf-afd5-4d3984932e2d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.278190 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzbb8\" (UniqueName: \"kubernetes.io/projected/579553a7-31d3-4b32-98fd-03e631e208d4-kube-api-access-pzbb8\") pod \"cert-manager-cainjector-7f985d654d-t8kb7\" (UID: \"579553a7-31d3-4b32-98fd-03e631e208d4\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-t8kb7" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.278265 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddzkc\" (UniqueName: \"kubernetes.io/projected/b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d-kube-api-access-ddzkc\") pod \"cert-manager-5b446d88c5-nfw2s\" (UID: \"b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d\") " pod="cert-manager/cert-manager-5b446d88c5-nfw2s" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.415412 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-t8kb7" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.431909 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-nfw2s" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.454572 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.629477 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-t8kb7"] Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.633597 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.672622 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-nfw2s"] Dec 08 20:15:35 crc kubenswrapper[4781]: W1208 20:15:35.674424 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb83e9dad_39a3_4cce_aaaf_4f3ecc8a9e2d.slice/crio-da45fa69064cee8d59638dc9a883ee6ca2d6d239700cca58c14206c11567fbbb WatchSource:0}: Error finding container da45fa69064cee8d59638dc9a883ee6ca2d6d239700cca58c14206c11567fbbb: Status 404 returned error can't find the container with id da45fa69064cee8d59638dc9a883ee6ca2d6d239700cca58c14206c11567fbbb Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.693984 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-cpdhc"] Dec 08 20:15:35 crc kubenswrapper[4781]: W1208 20:15:35.698818 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8d7f2da_3db7_4daf_afd5_4d3984932e2d.slice/crio-50c5f71e000b6e313ef0fe27bbaacb80ce7d2939007f3c23624801f9b4043243 WatchSource:0}: Error finding container 50c5f71e000b6e313ef0fe27bbaacb80ce7d2939007f3c23624801f9b4043243: Status 404 returned error can't find the container with id 50c5f71e000b6e313ef0fe27bbaacb80ce7d2939007f3c23624801f9b4043243 Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.946472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" event={"ID":"a8d7f2da-3db7-4daf-afd5-4d3984932e2d","Type":"ContainerStarted","Data":"50c5f71e000b6e313ef0fe27bbaacb80ce7d2939007f3c23624801f9b4043243"} Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.947803 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-t8kb7" event={"ID":"579553a7-31d3-4b32-98fd-03e631e208d4","Type":"ContainerStarted","Data":"24974b3413f2f5ef404d5e7fd2d207fcdc8b52ac14db9c4ed5cc6413c6feb358"} Dec 08 20:15:35 crc kubenswrapper[4781]: I1208 20:15:35.950032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nfw2s" event={"ID":"b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d","Type":"ContainerStarted","Data":"da45fa69064cee8d59638dc9a883ee6ca2d6d239700cca58c14206c11567fbbb"} Dec 08 20:15:38 crc kubenswrapper[4781]: I1208 20:15:38.974648 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-t8kb7" event={"ID":"579553a7-31d3-4b32-98fd-03e631e208d4","Type":"ContainerStarted","Data":"ad5275763037c4741f2a9a47eb78a352709fff11035dc2baafccfd0d27fe56c0"} Dec 08 20:15:38 crc kubenswrapper[4781]: I1208 20:15:38.976164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-nfw2s" event={"ID":"b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d","Type":"ContainerStarted","Data":"e653f4187301021a745e3a15769b9e05ff1426bb614b845d63770d8feef3f6cd"} Dec 08 20:15:38 crc kubenswrapper[4781]: I1208 20:15:38.992071 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-t8kb7" podStartSLOduration=1.707559602 podStartE2EDuration="3.992046406s" podCreationTimestamp="2025-12-08 20:15:35 +0000 UTC" firstStartedPulling="2025-12-08 20:15:35.633320703 +0000 UTC m=+651.784604080" lastFinishedPulling="2025-12-08 20:15:37.917807507 +0000 UTC m=+654.069090884" 
observedRunningTime="2025-12-08 20:15:38.988678789 +0000 UTC m=+655.139962176" watchObservedRunningTime="2025-12-08 20:15:38.992046406 +0000 UTC m=+655.143329793" Dec 08 20:15:39 crc kubenswrapper[4781]: I1208 20:15:39.009975 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-nfw2s" podStartSLOduration=1.715953881 podStartE2EDuration="4.009946883s" podCreationTimestamp="2025-12-08 20:15:35 +0000 UTC" firstStartedPulling="2025-12-08 20:15:35.675828238 +0000 UTC m=+651.827111615" lastFinishedPulling="2025-12-08 20:15:37.96982124 +0000 UTC m=+654.121104617" observedRunningTime="2025-12-08 20:15:39.00360744 +0000 UTC m=+655.154890827" watchObservedRunningTime="2025-12-08 20:15:39.009946883 +0000 UTC m=+655.161230260" Dec 08 20:15:39 crc kubenswrapper[4781]: I1208 20:15:39.985087 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" event={"ID":"a8d7f2da-3db7-4daf-afd5-4d3984932e2d","Type":"ContainerStarted","Data":"e5574cf4d09dc782347d3431a12f2d300824a21b5178fbb82348a5d01ad74921"} Dec 08 20:15:40 crc kubenswrapper[4781]: I1208 20:15:40.004847 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" podStartSLOduration=1.844838352 podStartE2EDuration="5.004823818s" podCreationTimestamp="2025-12-08 20:15:35 +0000 UTC" firstStartedPulling="2025-12-08 20:15:35.701249211 +0000 UTC m=+651.852532598" lastFinishedPulling="2025-12-08 20:15:38.861234677 +0000 UTC m=+655.012518064" observedRunningTime="2025-12-08 20:15:39.999827564 +0000 UTC m=+656.151110941" watchObservedRunningTime="2025-12-08 20:15:40.004823818 +0000 UTC m=+656.156107215" Dec 08 20:15:40 crc kubenswrapper[4781]: I1208 20:15:40.456149 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.376455 
4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-67t9k"] Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.377359 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovn-controller" containerID="cri-o://9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0" gracePeriod=30 Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.377393 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="nbdb" containerID="cri-o://d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6" gracePeriod=30 Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.377487 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="northd" containerID="cri-o://c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091" gracePeriod=30 Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.377547 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563" gracePeriod=30 Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.377583 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kube-rbac-proxy-node" containerID="cri-o://5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc" gracePeriod=30 Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 
20:15:44.377613 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovn-acl-logging" containerID="cri-o://9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c" gracePeriod=30 Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.377755 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="sbdb" containerID="cri-o://92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4" gracePeriod=30 Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.401320 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" containerID="cri-o://fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44" gracePeriod=30 Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.754863 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/3.log" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.758993 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovn-acl-logging/0.log" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.759965 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovn-controller/0.log" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.760614 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.817827 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s722j"] Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818113 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kube-rbac-proxy-ovn-metrics" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818145 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kube-rbac-proxy-ovn-metrics" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818163 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="northd" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818173 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="northd" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818182 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818188 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818196 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818201 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818209 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kubecfg-setup" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818216 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kubecfg-setup" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818222 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovn-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818228 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovn-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818239 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818246 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818253 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kube-rbac-proxy-node" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818259 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kube-rbac-proxy-node" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818268 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818274 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818280 4781 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovn-acl-logging" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818286 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovn-acl-logging" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818294 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="nbdb" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818299 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="nbdb" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818305 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="sbdb" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818310 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="sbdb" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818401 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818410 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="nbdb" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818418 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818425 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="sbdb" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818432 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" 
containerName="ovn-acl-logging" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818442 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="northd" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818447 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818453 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818460 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kube-rbac-proxy-ovn-metrics" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818468 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="kube-rbac-proxy-node" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818478 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovn-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: E1208 20:15:44.818868 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818881 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.818991 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" containerName="ovnkube-controller" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.820385 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879257 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879324 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-slash\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879359 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-bin\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879386 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-script-lib\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879397 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). 
InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879421 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-ovn\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879454 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-slash" (OuterVolumeSpecName: "host-slash") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879472 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879488 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879524 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-etc-openvswitch\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879547 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-var-lib-openvswitch\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879568 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-netns\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879602 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-kubelet\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879643 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-config\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879661 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-systemd-units\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879693 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-netd\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-openvswitch\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879749 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-node-log\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879769 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-systemd\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879791 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-ovn-kubernetes\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: 
I1208 20:15:44.879818 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3526d83-eb7e-486e-9357-80df536d09fd-ovn-node-metrics-cert\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879837 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-log-socket\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879864 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dbq4\" (UniqueName: \"kubernetes.io/projected/a3526d83-eb7e-486e-9357-80df536d09fd-kube-api-access-9dbq4\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.879895 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-env-overrides\") pod \"a3526d83-eb7e-486e-9357-80df536d09fd\" (UID: \"a3526d83-eb7e-486e-9357-80df536d09fd\") " Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880072 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880219 4781 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880235 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880260 4781 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880271 4781 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-slash\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880285 4781 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880317 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880345 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880369 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880394 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880417 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880460 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880529 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880558 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-node-log" (OuterVolumeSpecName: "node-log") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880763 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.880799 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.881094 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-log-socket" (OuterVolumeSpecName: "log-socket") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.881147 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.885448 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3526d83-eb7e-486e-9357-80df536d09fd-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.886215 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3526d83-eb7e-486e-9357-80df536d09fd-kube-api-access-9dbq4" (OuterVolumeSpecName: "kube-api-access-9dbq4") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "kube-api-access-9dbq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.894576 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a3526d83-eb7e-486e-9357-80df536d09fd" (UID: "a3526d83-eb7e-486e-9357-80df536d09fd"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.980969 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-cni-netd\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981057 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nj9f\" (UniqueName: \"kubernetes.io/projected/03ca4926-ea41-41fb-9f51-7d0edfefae81-kube-api-access-6nj9f\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981119 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-systemd\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981142 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-cni-bin\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981176 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-ovn\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-log-socket\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981223 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-systemd-units\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981248 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovn-node-metrics-cert\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981334 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-var-lib-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981403 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-run-ovn-kubernetes\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981436 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-run-netns\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981468 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovnkube-script-lib\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981497 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981539 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovnkube-config\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981580 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-kubelet\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981610 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-env-overrides\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981629 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-slash\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981658 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-etc-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981691 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-node-log\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981707 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981758 4781 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981772 4781 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981784 4781 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-netns\") on node \"crc\" DevicePath \"\"" 
Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981795 4781 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981806 4781 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981824 4781 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981838 4781 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981850 4781 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981860 4781 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-node-log\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981871 4781 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981955 4781 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.981994 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3526d83-eb7e-486e-9357-80df536d09fd-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.982005 4781 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3526d83-eb7e-486e-9357-80df536d09fd-log-socket\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.982016 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dbq4\" (UniqueName: \"kubernetes.io/projected/a3526d83-eb7e-486e-9357-80df536d09fd-kube-api-access-9dbq4\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:44 crc kubenswrapper[4781]: I1208 20:15:44.982026 4781 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3526d83-eb7e-486e-9357-80df536d09fd-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.017779 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovnkube-controller/3.log" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.020571 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovn-acl-logging/0.log" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.021349 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-67t9k_a3526d83-eb7e-486e-9357-80df536d09fd/ovn-controller/0.log" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 
20:15:45.021874 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44" exitCode=0 Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.021910 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4" exitCode=0 Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.021930 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6" exitCode=0 Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.021937 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091" exitCode=0 Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.021944 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563" exitCode=0 Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.021950 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc" exitCode=0 Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.021960 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c" exitCode=143 Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.021968 4781 generic.go:334] "Generic (PLEG): container finished" podID="a3526d83-eb7e-486e-9357-80df536d09fd" containerID="9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0" 
exitCode=143 Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022047 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"} Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022076 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"} Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022210 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"} Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022230 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"} Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022242 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"} Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022252 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" 
event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022264 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022274 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022280 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022286 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022292 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022297 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022302 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022309 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022314 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022323 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022333 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022341 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022348 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022354 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022361 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022369 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022375 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022382 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022387 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022392 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022096 4781 scope.go:117] "RemoveContainer" containerID="fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022399 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022410 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022416 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022422 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022426 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022431 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022436 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022441 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022446 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022452 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022457 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022464 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-67t9k" event={"ID":"a3526d83-eb7e-486e-9357-80df536d09fd","Type":"ContainerDied","Data":"5ba994fef7e3867eb1a9ffc1b0a942969dc4cc1f270552ea31887c53bdd46d2a"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022472 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022478 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022483 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022488 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022493 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022498 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022502 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022509 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022515 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.022520 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.024159 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/2.log"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.024657 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/1.log"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.024694 4781 generic.go:334] "Generic (PLEG): container finished" podID="a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20" containerID="aa1a90ca80410aee290e00aa4bec32980f22fe04e8d7dcac754b42a8fa098950" exitCode=2
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.024719 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tm5z7" event={"ID":"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20","Type":"ContainerDied","Data":"aa1a90ca80410aee290e00aa4bec32980f22fe04e8d7dcac754b42a8fa098950"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.024739 4781 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a"}
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.025082 4781 scope.go:117] "RemoveContainer" containerID="aa1a90ca80410aee290e00aa4bec32980f22fe04e8d7dcac754b42a8fa098950"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.025298 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tm5z7_openshift-multus(a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20)\"" pod="openshift-multus/multus-tm5z7" podUID="a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.050685 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.066191 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-67t9k"]
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.070770 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-67t9k"]
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083152 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovn-node-metrics-cert\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-var-lib-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-run-ovn-kubernetes\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083364 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-var-lib-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083383 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-run-netns\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083407 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-run-ovn-kubernetes\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083412 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovnkube-script-lib\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083469 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083500 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovnkube-config\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083562 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-kubelet\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083610 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-env-overrides\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-slash\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-etc-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083719 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-node-log\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083740 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083968 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-cni-netd\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083982 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-node-log\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083834 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083841 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-slash\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083992 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nj9f\" (UniqueName: \"kubernetes.io/projected/03ca4926-ea41-41fb-9f51-7d0edfefae81-kube-api-access-6nj9f\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084050 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-etc-openvswitch\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084090 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-systemd\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083810 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-kubelet\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084064 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-systemd\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084144 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-cni-bin\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084203 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-cni-bin\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084208 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovnkube-script-lib\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084172 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084202 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-env-overrides\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084140 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-cni-netd\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-ovn\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084305 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-log-socket\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-run-ovn\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.083429 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-host-run-netns\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084338 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-log-socket\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084339 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-systemd-units\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084532 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/03ca4926-ea41-41fb-9f51-7d0edfefae81-systemd-units\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084822 4781 scope.go:117] "RemoveContainer" containerID="92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.084901 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovnkube-config\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.088104 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/03ca4926-ea41-41fb-9f51-7d0edfefae81-ovn-node-metrics-cert\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.097968 4781 scope.go:117] "RemoveContainer" containerID="d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.101703 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nj9f\" (UniqueName: \"kubernetes.io/projected/03ca4926-ea41-41fb-9f51-7d0edfefae81-kube-api-access-6nj9f\") pod \"ovnkube-node-s722j\" (UID: \"03ca4926-ea41-41fb-9f51-7d0edfefae81\") " pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.111755 4781 scope.go:117] "RemoveContainer" containerID="c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.126446 4781 scope.go:117] "RemoveContainer" containerID="0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.134150 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s722j"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.138043 4781 scope.go:117] "RemoveContainer" containerID="5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.153733 4781 scope.go:117] "RemoveContainer" containerID="9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"
Dec 08 20:15:45 crc kubenswrapper[4781]: W1208 20:15:45.160266 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ca4926_ea41_41fb_9f51_7d0edfefae81.slice/crio-a6b978872b929d80de2d4b0867a104d78827a7c88ab26ba4025b11c3060ae8f5 WatchSource:0}: Error finding container a6b978872b929d80de2d4b0867a104d78827a7c88ab26ba4025b11c3060ae8f5: Status 404 returned error can't find the container with id a6b978872b929d80de2d4b0867a104d78827a7c88ab26ba4025b11c3060ae8f5
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.166677 4781 scope.go:117] "RemoveContainer" containerID="9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.183140 4781 scope.go:117] "RemoveContainer" containerID="9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.199278 4781 scope.go:117] "RemoveContainer" containerID="fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.199839 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": container with ID starting with fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44 not found: ID does not exist" containerID="fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.199893 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"} err="failed to get container status \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": rpc error: code = NotFound desc = could not find container \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": container with ID starting with fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44 not found: ID does not exist"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.199959 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.200306 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": container with ID starting with d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2 not found: ID does not exist" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.200334 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"} err="failed to get container status \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": rpc error: code = NotFound desc = could not find container \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": container with ID starting with d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2 not found: ID does not exist"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.200352 4781 scope.go:117] "RemoveContainer" containerID="92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.200651 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": container with ID starting with 92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4 not found: ID does not exist" containerID="92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.200675 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"} err="failed to get container status \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": rpc error: code = NotFound desc = could not find container \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": container with ID starting with 92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4 not found: ID does not exist"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.200688 4781 scope.go:117] "RemoveContainer" containerID="d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.200891 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": container with ID starting with d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6 not found: ID does not exist" containerID="d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.200925 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"} err="failed to get container status \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": rpc error: code = NotFound desc = could not find container \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": container with ID starting with d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6 not found: ID does not exist"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.200940 4781 scope.go:117] "RemoveContainer" containerID="c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.201240 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": container with ID starting with c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091 not found: ID does not exist" containerID="c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.201262 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"} err="failed to get container status \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": rpc error: code = NotFound desc = could not find container \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": container with ID starting with c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091 not found: ID does not exist"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.201274 4781 scope.go:117] "RemoveContainer" containerID="0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.201558 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": container with ID starting with 0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563 not found: ID does not exist" containerID="0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.201600 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"} err="failed to get container status \"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": rpc error: code = NotFound desc = could not find container \"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": container with ID starting with 0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563 not found: ID does not exist"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.201627 4781 scope.go:117] "RemoveContainer" containerID="5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.201979 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": container with ID starting with 5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc not found: ID does not exist" containerID="5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.202006 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"} err="failed to get container status \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": rpc error: code = NotFound desc = could not find container \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": container with ID starting with 5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc not found: ID does not exist"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.202020 4781 scope.go:117] "RemoveContainer" containerID="9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.202269 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": container with ID starting with 9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c not found: ID does not exist" containerID="9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.202289 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"} err="failed to get container status \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": rpc error: code = NotFound desc = could not find container \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": container with ID starting with 9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c not found: ID does not exist"
Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.202303 4781 scope.go:117] "RemoveContainer" containerID="9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"
Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.202670 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": container with ID starting with 9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0 not found: ID does not exist" containerID="9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"
Dec 08
20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.202698 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"} err="failed to get container status \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": rpc error: code = NotFound desc = could not find container \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": container with ID starting with 9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.202718 4781 scope.go:117] "RemoveContainer" containerID="9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f" Dec 08 20:15:45 crc kubenswrapper[4781]: E1208 20:15:45.203078 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": container with ID starting with 9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f not found: ID does not exist" containerID="9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.203093 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"} err="failed to get container status \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": rpc error: code = NotFound desc = could not find container \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": container with ID starting with 9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.203104 4781 scope.go:117] "RemoveContainer" 
containerID="fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.203373 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"} err="failed to get container status \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": rpc error: code = NotFound desc = could not find container \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": container with ID starting with fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.203393 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.203643 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"} err="failed to get container status \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": rpc error: code = NotFound desc = could not find container \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": container with ID starting with d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.203669 4781 scope.go:117] "RemoveContainer" containerID="92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.203941 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"} err="failed to get container status \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": rpc error: code = NotFound desc = could 
not find container \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": container with ID starting with 92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.203967 4781 scope.go:117] "RemoveContainer" containerID="d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.204301 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"} err="failed to get container status \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": rpc error: code = NotFound desc = could not find container \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": container with ID starting with d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.204330 4781 scope.go:117] "RemoveContainer" containerID="c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.204617 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"} err="failed to get container status \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": rpc error: code = NotFound desc = could not find container \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": container with ID starting with c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.204641 4781 scope.go:117] "RemoveContainer" containerID="0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 
20:15:45.204884 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"} err="failed to get container status \"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": rpc error: code = NotFound desc = could not find container \"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": container with ID starting with 0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.204906 4781 scope.go:117] "RemoveContainer" containerID="5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.205157 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"} err="failed to get container status \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": rpc error: code = NotFound desc = could not find container \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": container with ID starting with 5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.205198 4781 scope.go:117] "RemoveContainer" containerID="9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.205487 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"} err="failed to get container status \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": rpc error: code = NotFound desc = could not find container \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": container with ID starting with 
9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.205513 4781 scope.go:117] "RemoveContainer" containerID="9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.205737 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"} err="failed to get container status \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": rpc error: code = NotFound desc = could not find container \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": container with ID starting with 9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.205756 4781 scope.go:117] "RemoveContainer" containerID="9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.205952 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"} err="failed to get container status \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": rpc error: code = NotFound desc = could not find container \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": container with ID starting with 9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.205971 4781 scope.go:117] "RemoveContainer" containerID="fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.206172 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"} err="failed to get container status \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": rpc error: code = NotFound desc = could not find container \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": container with ID starting with fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.206189 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.206495 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"} err="failed to get container status \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": rpc error: code = NotFound desc = could not find container \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": container with ID starting with d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.206513 4781 scope.go:117] "RemoveContainer" containerID="92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.206709 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"} err="failed to get container status \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": rpc error: code = NotFound desc = could not find container \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": container with ID starting with 92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4 not found: ID does not 
exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.206725 4781 scope.go:117] "RemoveContainer" containerID="d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.207098 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"} err="failed to get container status \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": rpc error: code = NotFound desc = could not find container \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": container with ID starting with d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.207117 4781 scope.go:117] "RemoveContainer" containerID="c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.207346 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"} err="failed to get container status \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": rpc error: code = NotFound desc = could not find container \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": container with ID starting with c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.207368 4781 scope.go:117] "RemoveContainer" containerID="0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.207635 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"} err="failed to get container status 
\"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": rpc error: code = NotFound desc = could not find container \"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": container with ID starting with 0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.207654 4781 scope.go:117] "RemoveContainer" containerID="5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.207965 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"} err="failed to get container status \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": rpc error: code = NotFound desc = could not find container \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": container with ID starting with 5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.207986 4781 scope.go:117] "RemoveContainer" containerID="9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.208604 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"} err="failed to get container status \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": rpc error: code = NotFound desc = could not find container \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": container with ID starting with 9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.208624 4781 scope.go:117] "RemoveContainer" 
containerID="9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.208970 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"} err="failed to get container status \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": rpc error: code = NotFound desc = could not find container \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": container with ID starting with 9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.209007 4781 scope.go:117] "RemoveContainer" containerID="9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.209344 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"} err="failed to get container status \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": rpc error: code = NotFound desc = could not find container \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": container with ID starting with 9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.209424 4781 scope.go:117] "RemoveContainer" containerID="fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.209733 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"} err="failed to get container status \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": rpc error: code = NotFound desc = could 
not find container \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": container with ID starting with fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.209754 4781 scope.go:117] "RemoveContainer" containerID="d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.210183 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2"} err="failed to get container status \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": rpc error: code = NotFound desc = could not find container \"d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2\": container with ID starting with d1f0c8d6f19faa33f7205ff9b09c97bc9316f0dab10857b5385131b2a75829b2 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.210205 4781 scope.go:117] "RemoveContainer" containerID="92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.210409 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4"} err="failed to get container status \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": rpc error: code = NotFound desc = could not find container \"92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4\": container with ID starting with 92625e21ea2b5bc6c1049bfa1ec9f4b3cc0e41579768140b4392d1f0375561e4 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.210428 4781 scope.go:117] "RemoveContainer" containerID="d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 
20:15:45.210619 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6"} err="failed to get container status \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": rpc error: code = NotFound desc = could not find container \"d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6\": container with ID starting with d274520810293c9aeab7da15c36f2a0af3023587c90b99ba5c35a9e3ed5f77e6 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.210636 4781 scope.go:117] "RemoveContainer" containerID="c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.210857 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091"} err="failed to get container status \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": rpc error: code = NotFound desc = could not find container \"c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091\": container with ID starting with c3bbfe98c83b9192307a4b0f5ef5b13e55ef4be5ccac0fb2c1c7756e051ab091 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.210880 4781 scope.go:117] "RemoveContainer" containerID="0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.211173 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563"} err="failed to get container status \"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": rpc error: code = NotFound desc = could not find container \"0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563\": container with ID starting with 
0685fece7aa968958a241143e1b7b5cc5c82e0dd81af80e3cf63cb399d2ef563 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.211193 4781 scope.go:117] "RemoveContainer" containerID="5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.211477 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc"} err="failed to get container status \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": rpc error: code = NotFound desc = could not find container \"5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc\": container with ID starting with 5ad060f41a80effebaffce5d1e32766394f87435e6436f276aa892db744a1acc not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.211496 4781 scope.go:117] "RemoveContainer" containerID="9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.211766 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c"} err="failed to get container status \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": rpc error: code = NotFound desc = could not find container \"9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c\": container with ID starting with 9b40c3150e541356ff7c4b7db7ddc72326961b434e6d97ab424fd1f504cae80c not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.211789 4781 scope.go:117] "RemoveContainer" containerID="9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.212057 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0"} err="failed to get container status \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": rpc error: code = NotFound desc = could not find container \"9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0\": container with ID starting with 9830d1fe2f3d2e950f95d063acf9edf718997ae38ba9a0d0b8c05c5903d5b2b0 not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.212078 4781 scope.go:117] "RemoveContainer" containerID="9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.212354 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f"} err="failed to get container status \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": rpc error: code = NotFound desc = could not find container \"9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f\": container with ID starting with 9149828d1e64a2c5087698392563f31fdc33d6096d4cdf61f110fc3d8518ac6f not found: ID does not exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.212377 4781 scope.go:117] "RemoveContainer" containerID="fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.212588 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44"} err="failed to get container status \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": rpc error: code = NotFound desc = could not find container \"fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44\": container with ID starting with fa7a0e20e3704638c056aa011512206a93f8b39e9bc6039a1a897204a0195b44 not found: ID does not 
exist" Dec 08 20:15:45 crc kubenswrapper[4781]: I1208 20:15:45.458856 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-cpdhc" Dec 08 20:15:46 crc kubenswrapper[4781]: I1208 20:15:46.032318 4781 generic.go:334] "Generic (PLEG): container finished" podID="03ca4926-ea41-41fb-9f51-7d0edfefae81" containerID="7cef8b7045f86c9af64a88397bc0faa1937f480196888ade971171ef45e4ae26" exitCode=0 Dec 08 20:15:46 crc kubenswrapper[4781]: I1208 20:15:46.033278 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerDied","Data":"7cef8b7045f86c9af64a88397bc0faa1937f480196888ade971171ef45e4ae26"} Dec 08 20:15:46 crc kubenswrapper[4781]: I1208 20:15:46.033449 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"a6b978872b929d80de2d4b0867a104d78827a7c88ab26ba4025b11c3060ae8f5"} Dec 08 20:15:46 crc kubenswrapper[4781]: I1208 20:15:46.132938 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3526d83-eb7e-486e-9357-80df536d09fd" path="/var/lib/kubelet/pods/a3526d83-eb7e-486e-9357-80df536d09fd/volumes" Dec 08 20:15:47 crc kubenswrapper[4781]: I1208 20:15:47.043514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"ef445d07c6c4666e87d7ede47132a8ee8ef51438776f34fd1ed67f6d13bff869"} Dec 08 20:15:47 crc kubenswrapper[4781]: I1208 20:15:47.043852 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"5ec66b5da2a30ec1ad38bfba07afe1f64860b214b6d8584d4d5988b2697c176a"} Dec 08 20:15:47 
crc kubenswrapper[4781]: I1208 20:15:47.043864 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"2560f10c2a1a4dea55d4d3f4054ef6d9fbc86174882806c9a6d070547e6bc3e7"} Dec 08 20:15:47 crc kubenswrapper[4781]: I1208 20:15:47.043872 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"2befdddae37dbc0a42b408a7aad4545fa2f9b0db0a4751ef8dbbecd3498f0744"} Dec 08 20:15:47 crc kubenswrapper[4781]: I1208 20:15:47.043884 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"a48de3882f6b6d07d89ad4c8411d48ce07aa589f5c8d9dd04141c3d56f5ac492"} Dec 08 20:15:47 crc kubenswrapper[4781]: I1208 20:15:47.043893 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"28bb81ef5509bc1c9decedc757c7ea3c066c60e07d2b0be4ae7f168a69df31dd"} Dec 08 20:15:49 crc kubenswrapper[4781]: I1208 20:15:49.070987 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"72fd95663db36aababa8b570fa81e8bd55e49d3861fc1111e2f647433ee6dfd2"} Dec 08 20:15:52 crc kubenswrapper[4781]: I1208 20:15:52.092246 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" event={"ID":"03ca4926-ea41-41fb-9f51-7d0edfefae81","Type":"ContainerStarted","Data":"050a1935b3b9ed713da1c5c91af5eb599c3d988452744ad7f1dfa360834daa21"} Dec 08 20:15:52 crc kubenswrapper[4781]: I1208 20:15:52.092862 4781 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:52 crc kubenswrapper[4781]: I1208 20:15:52.092891 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:52 crc kubenswrapper[4781]: I1208 20:15:52.092903 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:52 crc kubenswrapper[4781]: I1208 20:15:52.117413 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:52 crc kubenswrapper[4781]: I1208 20:15:52.119428 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:15:52 crc kubenswrapper[4781]: I1208 20:15:52.122622 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" podStartSLOduration=8.122602842 podStartE2EDuration="8.122602842s" podCreationTimestamp="2025-12-08 20:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:15:52.116555867 +0000 UTC m=+668.267839244" watchObservedRunningTime="2025-12-08 20:15:52.122602842 +0000 UTC m=+668.273886219" Dec 08 20:15:57 crc kubenswrapper[4781]: I1208 20:15:57.126269 4781 scope.go:117] "RemoveContainer" containerID="aa1a90ca80410aee290e00aa4bec32980f22fe04e8d7dcac754b42a8fa098950" Dec 08 20:15:57 crc kubenswrapper[4781]: E1208 20:15:57.127031 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-tm5z7_openshift-multus(a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20)\"" pod="openshift-multus/multus-tm5z7" podUID="a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20" Dec 08 
20:16:08 crc kubenswrapper[4781]: I1208 20:16:08.126223 4781 scope.go:117] "RemoveContainer" containerID="aa1a90ca80410aee290e00aa4bec32980f22fe04e8d7dcac754b42a8fa098950" Dec 08 20:16:09 crc kubenswrapper[4781]: I1208 20:16:09.191448 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/2.log" Dec 08 20:16:09 crc kubenswrapper[4781]: I1208 20:16:09.192700 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/1.log" Dec 08 20:16:09 crc kubenswrapper[4781]: I1208 20:16:09.192795 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tm5z7" event={"ID":"a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20","Type":"ContainerStarted","Data":"f38acf553a025008513df0edf6a675d4c627cfab9823d4222f343d90d9fdb0ac"} Dec 08 20:16:15 crc kubenswrapper[4781]: I1208 20:16:15.155362 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s722j" Dec 08 20:16:24 crc kubenswrapper[4781]: I1208 20:16:24.960332 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9"] Dec 08 20:16:24 crc kubenswrapper[4781]: I1208 20:16:24.961887 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:24 crc kubenswrapper[4781]: I1208 20:16:24.963594 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 08 20:16:24 crc kubenswrapper[4781]: I1208 20:16:24.973126 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9"] Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.091141 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjq7d\" (UniqueName: \"kubernetes.io/projected/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-kube-api-access-gjq7d\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.091353 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.091458 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: 
I1208 20:16:25.192587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjq7d\" (UniqueName: \"kubernetes.io/projected/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-kube-api-access-gjq7d\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.192674 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.192715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.193234 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.193249 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.213979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjq7d\" (UniqueName: \"kubernetes.io/projected/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-kube-api-access-gjq7d\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.277880 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:25 crc kubenswrapper[4781]: I1208 20:16:25.461158 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9"] Dec 08 20:16:25 crc kubenswrapper[4781]: W1208 20:16:25.469524 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7006c3a5_ac34_49f3_81f9_25a51e4a5e9f.slice/crio-1b961ea556ae5b435ee9dd1a8c587926da832f56adba9ac9af0a2ca3c354988a WatchSource:0}: Error finding container 1b961ea556ae5b435ee9dd1a8c587926da832f56adba9ac9af0a2ca3c354988a: Status 404 returned error can't find the container with id 1b961ea556ae5b435ee9dd1a8c587926da832f56adba9ac9af0a2ca3c354988a Dec 08 20:16:26 crc kubenswrapper[4781]: I1208 20:16:26.280212 4781 generic.go:334] "Generic (PLEG): container finished" podID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerID="d310ad24c71f43cf4f1de6fea5e817de8d5373bd4e610af35b17673e6a2d4437" 
exitCode=0 Dec 08 20:16:26 crc kubenswrapper[4781]: I1208 20:16:26.280259 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" event={"ID":"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f","Type":"ContainerDied","Data":"d310ad24c71f43cf4f1de6fea5e817de8d5373bd4e610af35b17673e6a2d4437"} Dec 08 20:16:26 crc kubenswrapper[4781]: I1208 20:16:26.280287 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" event={"ID":"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f","Type":"ContainerStarted","Data":"1b961ea556ae5b435ee9dd1a8c587926da832f56adba9ac9af0a2ca3c354988a"} Dec 08 20:16:28 crc kubenswrapper[4781]: I1208 20:16:28.294277 4781 generic.go:334] "Generic (PLEG): container finished" podID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerID="80b4758e35aefc59cb1d323da496888da2dfe1c2dd359a1021ae68940c447fc7" exitCode=0 Dec 08 20:16:28 crc kubenswrapper[4781]: I1208 20:16:28.294348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" event={"ID":"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f","Type":"ContainerDied","Data":"80b4758e35aefc59cb1d323da496888da2dfe1c2dd359a1021ae68940c447fc7"} Dec 08 20:16:29 crc kubenswrapper[4781]: I1208 20:16:29.301551 4781 generic.go:334] "Generic (PLEG): container finished" podID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerID="8d1764b39bedeead406dcfee190e554cf936c8eb6ab47f07840f5e93b21c543c" exitCode=0 Dec 08 20:16:29 crc kubenswrapper[4781]: I1208 20:16:29.301615 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" event={"ID":"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f","Type":"ContainerDied","Data":"8d1764b39bedeead406dcfee190e554cf936c8eb6ab47f07840f5e93b21c543c"} Dec 08 20:16:30 crc 
kubenswrapper[4781]: I1208 20:16:30.547429 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.657876 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjq7d\" (UniqueName: \"kubernetes.io/projected/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-kube-api-access-gjq7d\") pod \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.657939 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-util\") pod \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.658069 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-bundle\") pod \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\" (UID: \"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f\") " Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.658901 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-bundle" (OuterVolumeSpecName: "bundle") pod "7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" (UID: "7006c3a5-ac34-49f3-81f9-25a51e4a5e9f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.672344 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-util" (OuterVolumeSpecName: "util") pod "7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" (UID: "7006c3a5-ac34-49f3-81f9-25a51e4a5e9f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.672506 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-kube-api-access-gjq7d" (OuterVolumeSpecName: "kube-api-access-gjq7d") pod "7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" (UID: "7006c3a5-ac34-49f3-81f9-25a51e4a5e9f"). InnerVolumeSpecName "kube-api-access-gjq7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.759125 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.759159 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjq7d\" (UniqueName: \"kubernetes.io/projected/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-kube-api-access-gjq7d\") on node \"crc\" DevicePath \"\"" Dec 08 20:16:30 crc kubenswrapper[4781]: I1208 20:16:30.759170 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7006c3a5-ac34-49f3-81f9-25a51e4a5e9f-util\") on node \"crc\" DevicePath \"\"" Dec 08 20:16:31 crc kubenswrapper[4781]: I1208 20:16:31.316391 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" 
event={"ID":"7006c3a5-ac34-49f3-81f9-25a51e4a5e9f","Type":"ContainerDied","Data":"1b961ea556ae5b435ee9dd1a8c587926da832f56adba9ac9af0a2ca3c354988a"} Dec 08 20:16:31 crc kubenswrapper[4781]: I1208 20:16:31.316426 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b961ea556ae5b435ee9dd1a8c587926da832f56adba9ac9af0a2ca3c354988a" Dec 08 20:16:31 crc kubenswrapper[4781]: I1208 20:16:31.316436 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.921965 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75"] Dec 08 20:16:33 crc kubenswrapper[4781]: E1208 20:16:33.922498 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerName="extract" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.922513 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerName="extract" Dec 08 20:16:33 crc kubenswrapper[4781]: E1208 20:16:33.922535 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerName="util" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.922544 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerName="util" Dec 08 20:16:33 crc kubenswrapper[4781]: E1208 20:16:33.922562 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerName="pull" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.922571 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerName="pull" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.922679 4781 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7006c3a5-ac34-49f3-81f9-25a51e4a5e9f" containerName="extract" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.923121 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.925593 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.925633 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.926023 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-x7tg5" Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.938596 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75"] Dec 08 20:16:33 crc kubenswrapper[4781]: I1208 20:16:33.999637 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zppkl\" (UniqueName: \"kubernetes.io/projected/0e3edf9d-cddf-4b23-bda2-930fb5cbaf27-kube-api-access-zppkl\") pod \"nmstate-operator-5b5b58f5c8-66p75\" (UID: \"0e3edf9d-cddf-4b23-bda2-930fb5cbaf27\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75" Dec 08 20:16:34 crc kubenswrapper[4781]: I1208 20:16:34.100506 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zppkl\" (UniqueName: \"kubernetes.io/projected/0e3edf9d-cddf-4b23-bda2-930fb5cbaf27-kube-api-access-zppkl\") pod \"nmstate-operator-5b5b58f5c8-66p75\" (UID: \"0e3edf9d-cddf-4b23-bda2-930fb5cbaf27\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75" Dec 08 20:16:34 crc kubenswrapper[4781]: I1208 20:16:34.117887 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zppkl\" (UniqueName: \"kubernetes.io/projected/0e3edf9d-cddf-4b23-bda2-930fb5cbaf27-kube-api-access-zppkl\") pod \"nmstate-operator-5b5b58f5c8-66p75\" (UID: \"0e3edf9d-cddf-4b23-bda2-930fb5cbaf27\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75" Dec 08 20:16:34 crc kubenswrapper[4781]: I1208 20:16:34.236671 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75" Dec 08 20:16:34 crc kubenswrapper[4781]: I1208 20:16:34.691318 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75"] Dec 08 20:16:35 crc kubenswrapper[4781]: I1208 20:16:35.339611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75" event={"ID":"0e3edf9d-cddf-4b23-bda2-930fb5cbaf27","Type":"ContainerStarted","Data":"7a5a63a820b9b716b20c537c4e502c5badbf90e8d7685e2ce2151ab66ae737d1"} Dec 08 20:16:37 crc kubenswrapper[4781]: I1208 20:16:37.355643 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75" event={"ID":"0e3edf9d-cddf-4b23-bda2-930fb5cbaf27","Type":"ContainerStarted","Data":"dc89b4bee55c317db32ad100263f484f598f739cf7cd024b67a832c394da61c0"} Dec 08 20:16:37 crc kubenswrapper[4781]: I1208 20:16:37.384801 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-66p75" podStartSLOduration=2.586979062 podStartE2EDuration="4.384777719s" podCreationTimestamp="2025-12-08 20:16:33 +0000 UTC" firstStartedPulling="2025-12-08 20:16:34.705322126 +0000 UTC m=+710.856605503" lastFinishedPulling="2025-12-08 20:16:36.503120783 +0000 UTC m=+712.654404160" observedRunningTime="2025-12-08 20:16:37.383307186 +0000 UTC m=+713.534590623" watchObservedRunningTime="2025-12-08 20:16:37.384777719 +0000 UTC m=+713.536061086" Dec 08 20:16:38 crc 
kubenswrapper[4781]: I1208 20:16:38.336452 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.337774 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.344633 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-gtzfp" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.353245 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.357956 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.358830 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.360951 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.371659 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.380761 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-777jl"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.381647 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.460494 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.461227 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.463240 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.463464 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.463606 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-6kqn7" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.464961 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26857532-cde7-49e1-924f-eda2b362b6b7-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.465041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f547b46e-1a15-4d7c-a3c6-0167927eb75c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-2m6pl\" (UID: \"f547b46e-1a15-4d7c-a3c6-0167927eb75c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.465073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8qbbm\" (UniqueName: \"kubernetes.io/projected/f547b46e-1a15-4d7c-a3c6-0167927eb75c-kube-api-access-8qbbm\") pod \"nmstate-webhook-5f6d4c5ccb-2m6pl\" (UID: \"f547b46e-1a15-4d7c-a3c6-0167927eb75c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.465103 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxl5s\" (UniqueName: \"kubernetes.io/projected/ff52b8c4-e459-42ed-96a1-dead7cdcd8b9-kube-api-access-kxl5s\") pod \"nmstate-metrics-7f946cbc9-qjn2k\" (UID: \"ff52b8c4-e459-42ed-96a1-dead7cdcd8b9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.465174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdkf7\" (UniqueName: \"kubernetes.io/projected/26857532-cde7-49e1-924f-eda2b362b6b7-kube-api-access-wdkf7\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.465211 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26857532-cde7-49e1-924f-eda2b362b6b7-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.476211 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.566856 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxl5s\" (UniqueName: 
\"kubernetes.io/projected/ff52b8c4-e459-42ed-96a1-dead7cdcd8b9-kube-api-access-kxl5s\") pod \"nmstate-metrics-7f946cbc9-qjn2k\" (UID: \"ff52b8c4-e459-42ed-96a1-dead7cdcd8b9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.566909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-nmstate-lock\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.566953 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-ovs-socket\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.566989 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdkf7\" (UniqueName: \"kubernetes.io/projected/26857532-cde7-49e1-924f-eda2b362b6b7-kube-api-access-wdkf7\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.567219 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdvzj\" (UniqueName: \"kubernetes.io/projected/74d0c70c-fc95-4b78-855f-96eb01f08c07-kube-api-access-kdvzj\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.567326 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26857532-cde7-49e1-924f-eda2b362b6b7-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.567393 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-dbus-socket\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.567444 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26857532-cde7-49e1-924f-eda2b362b6b7-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: E1208 20:16:38.567481 4781 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.567526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f547b46e-1a15-4d7c-a3c6-0167927eb75c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-2m6pl\" (UID: \"f547b46e-1a15-4d7c-a3c6-0167927eb75c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:38 crc kubenswrapper[4781]: E1208 20:16:38.567563 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26857532-cde7-49e1-924f-eda2b362b6b7-plugin-serving-cert podName:26857532-cde7-49e1-924f-eda2b362b6b7 nodeName:}" failed. 
No retries permitted until 2025-12-08 20:16:39.067541512 +0000 UTC m=+715.218824889 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/26857532-cde7-49e1-924f-eda2b362b6b7-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-bsh4c" (UID: "26857532-cde7-49e1-924f-eda2b362b6b7") : secret "plugin-serving-cert" not found Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.567593 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qbbm\" (UniqueName: \"kubernetes.io/projected/f547b46e-1a15-4d7c-a3c6-0167927eb75c-kube-api-access-8qbbm\") pod \"nmstate-webhook-5f6d4c5ccb-2m6pl\" (UID: \"f547b46e-1a15-4d7c-a3c6-0167927eb75c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.568450 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26857532-cde7-49e1-924f-eda2b362b6b7-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.573864 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f547b46e-1a15-4d7c-a3c6-0167927eb75c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-2m6pl\" (UID: \"f547b46e-1a15-4d7c-a3c6-0167927eb75c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.586884 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdkf7\" (UniqueName: \"kubernetes.io/projected/26857532-cde7-49e1-924f-eda2b362b6b7-kube-api-access-wdkf7\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.590015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qbbm\" (UniqueName: \"kubernetes.io/projected/f547b46e-1a15-4d7c-a3c6-0167927eb75c-kube-api-access-8qbbm\") pod \"nmstate-webhook-5f6d4c5ccb-2m6pl\" (UID: \"f547b46e-1a15-4d7c-a3c6-0167927eb75c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.595610 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxl5s\" (UniqueName: \"kubernetes.io/projected/ff52b8c4-e459-42ed-96a1-dead7cdcd8b9-kube-api-access-kxl5s\") pod \"nmstate-metrics-7f946cbc9-qjn2k\" (UID: \"ff52b8c4-e459-42ed-96a1-dead7cdcd8b9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.659893 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c7c8f5b58-b5n5l"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.660523 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.662203 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668336 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-nmstate-lock\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668389 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-ovs-socket\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668432 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-oauth-serving-cert\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668459 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-service-ca\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668432 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-nmstate-lock\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " 
pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668480 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-trusted-ca-bundle\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668506 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-ovs-socket\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668574 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdvzj\" (UniqueName: \"kubernetes.io/projected/74d0c70c-fc95-4b78-855f-96eb01f08c07-kube-api-access-kdvzj\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668616 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-oauth-config\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668709 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-dbus-socket\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " 
pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668733 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-serving-cert\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668841 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-config\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.668877 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whbst\" (UniqueName: \"kubernetes.io/projected/231f8c6a-1f21-4299-9c05-60cb727bcc32-kube-api-access-whbst\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.669051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d0c70c-fc95-4b78-855f-96eb01f08c07-dbus-socket\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.679385 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.693992 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c7c8f5b58-b5n5l"] Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.712416 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdvzj\" (UniqueName: \"kubernetes.io/projected/74d0c70c-fc95-4b78-855f-96eb01f08c07-kube-api-access-kdvzj\") pod \"nmstate-handler-777jl\" (UID: \"74d0c70c-fc95-4b78-855f-96eb01f08c07\") " pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.769639 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-serving-cert\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.770179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-config\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.770260 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whbst\" (UniqueName: \"kubernetes.io/projected/231f8c6a-1f21-4299-9c05-60cb727bcc32-kube-api-access-whbst\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.770379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-oauth-serving-cert\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.770414 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-service-ca\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.770436 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-trusted-ca-bundle\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.770490 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-oauth-config\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.771215 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-config\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.771454 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-oauth-serving-cert\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.771536 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-service-ca\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.772590 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231f8c6a-1f21-4299-9c05-60cb727bcc32-trusted-ca-bundle\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.776399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-serving-cert\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.778288 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/231f8c6a-1f21-4299-9c05-60cb727bcc32-console-oauth-config\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.787791 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whbst\" (UniqueName: 
\"kubernetes.io/projected/231f8c6a-1f21-4299-9c05-60cb727bcc32-kube-api-access-whbst\") pod \"console-c7c8f5b58-b5n5l\" (UID: \"231f8c6a-1f21-4299-9c05-60cb727bcc32\") " pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.891709 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl"] Dec 08 20:16:38 crc kubenswrapper[4781]: W1208 20:16:38.893123 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf547b46e_1a15_4d7c_a3c6_0167927eb75c.slice/crio-84412a6a344310ebb2978897f54e6ae850c51ca2aa61dbef83aafb80575f1b3e WatchSource:0}: Error finding container 84412a6a344310ebb2978897f54e6ae850c51ca2aa61dbef83aafb80575f1b3e: Status 404 returned error can't find the container with id 84412a6a344310ebb2978897f54e6ae850c51ca2aa61dbef83aafb80575f1b3e Dec 08 20:16:38 crc kubenswrapper[4781]: I1208 20:16:38.999662 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:39 crc kubenswrapper[4781]: W1208 20:16:39.017807 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d0c70c_fc95_4b78_855f_96eb01f08c07.slice/crio-8071b067540f8cb9559157aae6a928695d74b52ea9fed876065808b1346c5e3e WatchSource:0}: Error finding container 8071b067540f8cb9559157aae6a928695d74b52ea9fed876065808b1346c5e3e: Status 404 returned error can't find the container with id 8071b067540f8cb9559157aae6a928695d74b52ea9fed876065808b1346c5e3e Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.047398 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.074232 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26857532-cde7-49e1-924f-eda2b362b6b7-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.077881 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/26857532-cde7-49e1-924f-eda2b362b6b7-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-bsh4c\" (UID: \"26857532-cde7-49e1-924f-eda2b362b6b7\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.162029 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k"] Dec 08 20:16:39 crc kubenswrapper[4781]: W1208 20:16:39.170855 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff52b8c4_e459_42ed_96a1_dead7cdcd8b9.slice/crio-b333a04010b00161cddb72e82a833c71620ed233694955519a0bfefeaaf87b42 WatchSource:0}: Error finding container b333a04010b00161cddb72e82a833c71620ed233694955519a0bfefeaaf87b42: Status 404 returned error can't find the container with id b333a04010b00161cddb72e82a833c71620ed233694955519a0bfefeaaf87b42 Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.252810 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c7c8f5b58-b5n5l"] Dec 08 20:16:39 crc kubenswrapper[4781]: W1208 20:16:39.258370 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod231f8c6a_1f21_4299_9c05_60cb727bcc32.slice/crio-17c7989f20907cac35720ec682378302bf42b4a79c5ac628731de89eb7337ad5 WatchSource:0}: Error finding container 17c7989f20907cac35720ec682378302bf42b4a79c5ac628731de89eb7337ad5: Status 404 returned error can't find the container with id 17c7989f20907cac35720ec682378302bf42b4a79c5ac628731de89eb7337ad5 Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.373152 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7c8f5b58-b5n5l" event={"ID":"231f8c6a-1f21-4299-9c05-60cb727bcc32","Type":"ContainerStarted","Data":"17c7989f20907cac35720ec682378302bf42b4a79c5ac628731de89eb7337ad5"} Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.374419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-777jl" event={"ID":"74d0c70c-fc95-4b78-855f-96eb01f08c07","Type":"ContainerStarted","Data":"8071b067540f8cb9559157aae6a928695d74b52ea9fed876065808b1346c5e3e"} Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.375639 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.375399 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" event={"ID":"ff52b8c4-e459-42ed-96a1-dead7cdcd8b9","Type":"ContainerStarted","Data":"b333a04010b00161cddb72e82a833c71620ed233694955519a0bfefeaaf87b42"} Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.378535 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" event={"ID":"f547b46e-1a15-4d7c-a3c6-0167927eb75c","Type":"ContainerStarted","Data":"84412a6a344310ebb2978897f54e6ae850c51ca2aa61dbef83aafb80575f1b3e"} Dec 08 20:16:39 crc kubenswrapper[4781]: I1208 20:16:39.556458 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c"] Dec 08 20:16:39 crc kubenswrapper[4781]: W1208 20:16:39.563211 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26857532_cde7_49e1_924f_eda2b362b6b7.slice/crio-74f31de485a624f7c1875e704c58b688c640326d242672a70efeb1cb4f3d39d5 WatchSource:0}: Error finding container 74f31de485a624f7c1875e704c58b688c640326d242672a70efeb1cb4f3d39d5: Status 404 returned error can't find the container with id 74f31de485a624f7c1875e704c58b688c640326d242672a70efeb1cb4f3d39d5 Dec 08 20:16:40 crc kubenswrapper[4781]: I1208 20:16:40.389999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c7c8f5b58-b5n5l" event={"ID":"231f8c6a-1f21-4299-9c05-60cb727bcc32","Type":"ContainerStarted","Data":"41ac570cff686f1816f74743a807716ec138a15ab28156bb206fad67fb536db6"} Dec 08 20:16:40 crc kubenswrapper[4781]: I1208 20:16:40.391786 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" 
event={"ID":"26857532-cde7-49e1-924f-eda2b362b6b7","Type":"ContainerStarted","Data":"74f31de485a624f7c1875e704c58b688c640326d242672a70efeb1cb4f3d39d5"} Dec 08 20:16:40 crc kubenswrapper[4781]: I1208 20:16:40.409285 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c7c8f5b58-b5n5l" podStartSLOduration=2.409265096 podStartE2EDuration="2.409265096s" podCreationTimestamp="2025-12-08 20:16:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:16:40.406716792 +0000 UTC m=+716.558000169" watchObservedRunningTime="2025-12-08 20:16:40.409265096 +0000 UTC m=+716.560548473" Dec 08 20:16:41 crc kubenswrapper[4781]: I1208 20:16:41.404738 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-777jl" event={"ID":"74d0c70c-fc95-4b78-855f-96eb01f08c07","Type":"ContainerStarted","Data":"f439485c3de8c06cf8377b158fbf8c886b8154df27ef226ee2c16b9d6da0798b"} Dec 08 20:16:41 crc kubenswrapper[4781]: I1208 20:16:41.405108 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:41 crc kubenswrapper[4781]: I1208 20:16:41.406558 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" event={"ID":"ff52b8c4-e459-42ed-96a1-dead7cdcd8b9","Type":"ContainerStarted","Data":"d697929e42c16e0f432e01f19b46fcab8722dea4f636e67771b0570678b8c8a5"} Dec 08 20:16:41 crc kubenswrapper[4781]: I1208 20:16:41.408331 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" event={"ID":"f547b46e-1a15-4d7c-a3c6-0167927eb75c","Type":"ContainerStarted","Data":"e5bc495b8cb6850b003302facf9c6d89ca487a2bf36e9db76d32330bd76fbde3"} Dec 08 20:16:41 crc kubenswrapper[4781]: I1208 20:16:41.422814 4781 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-handler-777jl" podStartSLOduration=1.378770961 podStartE2EDuration="3.422795261s" podCreationTimestamp="2025-12-08 20:16:38 +0000 UTC" firstStartedPulling="2025-12-08 20:16:39.019673461 +0000 UTC m=+715.170956838" lastFinishedPulling="2025-12-08 20:16:41.063697761 +0000 UTC m=+717.214981138" observedRunningTime="2025-12-08 20:16:41.418794616 +0000 UTC m=+717.570077993" watchObservedRunningTime="2025-12-08 20:16:41.422795261 +0000 UTC m=+717.574078638" Dec 08 20:16:41 crc kubenswrapper[4781]: I1208 20:16:41.436948 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" podStartSLOduration=1.231440498 podStartE2EDuration="3.436910577s" podCreationTimestamp="2025-12-08 20:16:38 +0000 UTC" firstStartedPulling="2025-12-08 20:16:38.895145444 +0000 UTC m=+715.046428821" lastFinishedPulling="2025-12-08 20:16:41.100615523 +0000 UTC m=+717.251898900" observedRunningTime="2025-12-08 20:16:41.434124717 +0000 UTC m=+717.585408104" watchObservedRunningTime="2025-12-08 20:16:41.436910577 +0000 UTC m=+717.588193954" Dec 08 20:16:42 crc kubenswrapper[4781]: I1208 20:16:42.415530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" event={"ID":"26857532-cde7-49e1-924f-eda2b362b6b7","Type":"ContainerStarted","Data":"8146f48a2edf6d174637458a6594425a57bd469dbc4651ca60da9450d9fa209c"} Dec 08 20:16:42 crc kubenswrapper[4781]: I1208 20:16:42.416428 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:16:42 crc kubenswrapper[4781]: I1208 20:16:42.436475 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-bsh4c" podStartSLOduration=1.970278711 podStartE2EDuration="4.43645731s" podCreationTimestamp="2025-12-08 20:16:38 +0000 UTC" 
firstStartedPulling="2025-12-08 20:16:39.565671992 +0000 UTC m=+715.716955369" lastFinishedPulling="2025-12-08 20:16:42.031850591 +0000 UTC m=+718.183133968" observedRunningTime="2025-12-08 20:16:42.433236468 +0000 UTC m=+718.584519865" watchObservedRunningTime="2025-12-08 20:16:42.43645731 +0000 UTC m=+718.587740687" Dec 08 20:16:43 crc kubenswrapper[4781]: I1208 20:16:43.421608 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" event={"ID":"ff52b8c4-e459-42ed-96a1-dead7cdcd8b9","Type":"ContainerStarted","Data":"a66f6b3cde6da6f9adbbb3a0bfd6a23bbe117add100f85e7b337f0f0f031d374"} Dec 08 20:16:44 crc kubenswrapper[4781]: I1208 20:16:44.440164 4781 scope.go:117] "RemoveContainer" containerID="0a19a77a66bede932c0b113bf2257987ba53c17493c3db9eee6b2869f005dd4a" Dec 08 20:16:45 crc kubenswrapper[4781]: I1208 20:16:45.438318 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tm5z7_a6e3a0f7-c8ca-46f6-a21a-002c28dd7e20/kube-multus/2.log" Dec 08 20:16:49 crc kubenswrapper[4781]: I1208 20:16:49.024066 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-777jl" Dec 08 20:16:49 crc kubenswrapper[4781]: I1208 20:16:49.040568 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-qjn2k" podStartSLOduration=7.11021506 podStartE2EDuration="11.040552914s" podCreationTimestamp="2025-12-08 20:16:38 +0000 UTC" firstStartedPulling="2025-12-08 20:16:39.173115593 +0000 UTC m=+715.324398970" lastFinishedPulling="2025-12-08 20:16:43.103453447 +0000 UTC m=+719.254736824" observedRunningTime="2025-12-08 20:16:43.446971109 +0000 UTC m=+719.598254526" watchObservedRunningTime="2025-12-08 20:16:49.040552914 +0000 UTC m=+725.191836291" Dec 08 20:16:49 crc kubenswrapper[4781]: I1208 20:16:49.048242 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:49 crc kubenswrapper[4781]: I1208 20:16:49.048355 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:49 crc kubenswrapper[4781]: I1208 20:16:49.052946 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:49 crc kubenswrapper[4781]: I1208 20:16:49.468379 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c7c8f5b58-b5n5l" Dec 08 20:16:49 crc kubenswrapper[4781]: I1208 20:16:49.523517 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tmmk6"] Dec 08 20:16:58 crc kubenswrapper[4781]: I1208 20:16:58.684199 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-2m6pl" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.362633 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629"] Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.375895 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.378619 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.384112 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629"] Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.568180 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.568229 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.568275 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fl6f\" (UniqueName: \"kubernetes.io/projected/e4836df2-487a-4750-b2fc-28c2fd4394f5-kube-api-access-5fl6f\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: 
I1208 20:17:10.669011 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fl6f\" (UniqueName: \"kubernetes.io/projected/e4836df2-487a-4750-b2fc-28c2fd4394f5-kube-api-access-5fl6f\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.669132 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.669154 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.669579 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.669891 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.690343 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fl6f\" (UniqueName: \"kubernetes.io/projected/e4836df2-487a-4750-b2fc-28c2fd4394f5-kube-api-access-5fl6f\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.712347 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:10 crc kubenswrapper[4781]: I1208 20:17:10.915896 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629"] Dec 08 20:17:10 crc kubenswrapper[4781]: W1208 20:17:10.922361 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4836df2_487a_4750_b2fc_28c2fd4394f5.slice/crio-4e5a30e58abfff2188ad5af3c3dd5f842d1a887e6ca918bfd2ea5fa447d3a851 WatchSource:0}: Error finding container 4e5a30e58abfff2188ad5af3c3dd5f842d1a887e6ca918bfd2ea5fa447d3a851: Status 404 returned error can't find the container with id 4e5a30e58abfff2188ad5af3c3dd5f842d1a887e6ca918bfd2ea5fa447d3a851 Dec 08 20:17:11 crc kubenswrapper[4781]: I1208 20:17:11.589683 4781 generic.go:334] "Generic (PLEG): container finished" podID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerID="7267afc4a593ee3c5753ea59fe9c7a9f8a24cfe62703b05ca81bf3652164d443" 
exitCode=0 Dec 08 20:17:11 crc kubenswrapper[4781]: I1208 20:17:11.589774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" event={"ID":"e4836df2-487a-4750-b2fc-28c2fd4394f5","Type":"ContainerDied","Data":"7267afc4a593ee3c5753ea59fe9c7a9f8a24cfe62703b05ca81bf3652164d443"} Dec 08 20:17:11 crc kubenswrapper[4781]: I1208 20:17:11.590086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" event={"ID":"e4836df2-487a-4750-b2fc-28c2fd4394f5","Type":"ContainerStarted","Data":"4e5a30e58abfff2188ad5af3c3dd5f842d1a887e6ca918bfd2ea5fa447d3a851"} Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.721293 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wgs7b"] Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.722619 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.729883 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgs7b"] Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.896031 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-catalog-content\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.896096 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nbph\" (UniqueName: \"kubernetes.io/projected/0c17dfaf-9acb-4a0d-8251-f87397c80ece-kube-api-access-6nbph\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.896130 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-utilities\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.996802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nbph\" (UniqueName: \"kubernetes.io/projected/0c17dfaf-9acb-4a0d-8251-f87397c80ece-kube-api-access-6nbph\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.997012 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-utilities\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.997182 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-catalog-content\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.997510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-utilities\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:12 crc kubenswrapper[4781]: I1208 20:17:12.997510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-catalog-content\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:13 crc kubenswrapper[4781]: I1208 20:17:13.017293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nbph\" (UniqueName: \"kubernetes.io/projected/0c17dfaf-9acb-4a0d-8251-f87397c80ece-kube-api-access-6nbph\") pod \"redhat-operators-wgs7b\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:13 crc kubenswrapper[4781]: I1208 20:17:13.097604 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:13 crc kubenswrapper[4781]: I1208 20:17:13.309268 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgs7b"] Dec 08 20:17:13 crc kubenswrapper[4781]: I1208 20:17:13.602949 4781 generic.go:334] "Generic (PLEG): container finished" podID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerID="414c39e5d7b68bf7e81a1bdd0d2f019be2b4837b80537b7895c52716c602ae8e" exitCode=0 Dec 08 20:17:13 crc kubenswrapper[4781]: I1208 20:17:13.603022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgs7b" event={"ID":"0c17dfaf-9acb-4a0d-8251-f87397c80ece","Type":"ContainerDied","Data":"414c39e5d7b68bf7e81a1bdd0d2f019be2b4837b80537b7895c52716c602ae8e"} Dec 08 20:17:13 crc kubenswrapper[4781]: I1208 20:17:13.603046 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgs7b" event={"ID":"0c17dfaf-9acb-4a0d-8251-f87397c80ece","Type":"ContainerStarted","Data":"a326b053f0d7f764a0b511c71a99753b134a3df1279c1dfc60b4595e78c3e343"} Dec 08 20:17:13 crc kubenswrapper[4781]: I1208 20:17:13.604873 4781 generic.go:334] "Generic (PLEG): container finished" podID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerID="8b2b67af69e152277a01e52844318a12bf41e0f5e2ae4decdd3f32bb7a5c42f6" exitCode=0 Dec 08 20:17:13 crc kubenswrapper[4781]: I1208 20:17:13.604904 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" event={"ID":"e4836df2-487a-4750-b2fc-28c2fd4394f5","Type":"ContainerDied","Data":"8b2b67af69e152277a01e52844318a12bf41e0f5e2ae4decdd3f32bb7a5c42f6"} Dec 08 20:17:14 crc kubenswrapper[4781]: I1208 20:17:14.565477 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-tmmk6" podUID="5669be4d-29d4-4cee-ad63-75f37e3727d2" 
containerName="console" containerID="cri-o://d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a" gracePeriod=15 Dec 08 20:17:14 crc kubenswrapper[4781]: I1208 20:17:14.614533 4781 generic.go:334] "Generic (PLEG): container finished" podID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerID="c4c8f3aee9593b9b1f4de52dd6184de68b93c629018e77f7514edbaa849a9ba0" exitCode=0 Dec 08 20:17:14 crc kubenswrapper[4781]: I1208 20:17:14.614611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" event={"ID":"e4836df2-487a-4750-b2fc-28c2fd4394f5","Type":"ContainerDied","Data":"c4c8f3aee9593b9b1f4de52dd6184de68b93c629018e77f7514edbaa849a9ba0"} Dec 08 20:17:14 crc kubenswrapper[4781]: I1208 20:17:14.617123 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgs7b" event={"ID":"0c17dfaf-9acb-4a0d-8251-f87397c80ece","Type":"ContainerStarted","Data":"fe81e6f5b9fddc74059cce1c62849a2c0cb01b6a4335f3d5d062def56465b5db"} Dec 08 20:17:14 crc kubenswrapper[4781]: I1208 20:17:14.891282 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tmmk6_5669be4d-29d4-4cee-ad63-75f37e3727d2/console/0.log" Dec 08 20:17:14 crc kubenswrapper[4781]: I1208 20:17:14.891364 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.019630 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-service-ca\") pod \"5669be4d-29d4-4cee-ad63-75f37e3727d2\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.019685 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-oauth-serving-cert\") pod \"5669be4d-29d4-4cee-ad63-75f37e3727d2\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.019757 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-trusted-ca-bundle\") pod \"5669be4d-29d4-4cee-ad63-75f37e3727d2\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.019777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-serving-cert\") pod \"5669be4d-29d4-4cee-ad63-75f37e3727d2\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.019829 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6zj6\" (UniqueName: \"kubernetes.io/projected/5669be4d-29d4-4cee-ad63-75f37e3727d2-kube-api-access-p6zj6\") pod \"5669be4d-29d4-4cee-ad63-75f37e3727d2\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.020611 4781 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5669be4d-29d4-4cee-ad63-75f37e3727d2" (UID: "5669be4d-29d4-4cee-ad63-75f37e3727d2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.020768 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-config\") pod \"5669be4d-29d4-4cee-ad63-75f37e3727d2\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.020781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5669be4d-29d4-4cee-ad63-75f37e3727d2" (UID: "5669be4d-29d4-4cee-ad63-75f37e3727d2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.020801 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-oauth-config\") pod \"5669be4d-29d4-4cee-ad63-75f37e3727d2\" (UID: \"5669be4d-29d4-4cee-ad63-75f37e3727d2\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.021021 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-service-ca" (OuterVolumeSpecName: "service-ca") pod "5669be4d-29d4-4cee-ad63-75f37e3727d2" (UID: "5669be4d-29d4-4cee-ad63-75f37e3727d2"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.021321 4781 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-service-ca\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.021340 4781 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.021354 4781 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.021506 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-config" (OuterVolumeSpecName: "console-config") pod "5669be4d-29d4-4cee-ad63-75f37e3727d2" (UID: "5669be4d-29d4-4cee-ad63-75f37e3727d2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.025984 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5669be4d-29d4-4cee-ad63-75f37e3727d2" (UID: "5669be4d-29d4-4cee-ad63-75f37e3727d2"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.026200 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5669be4d-29d4-4cee-ad63-75f37e3727d2" (UID: "5669be4d-29d4-4cee-ad63-75f37e3727d2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.026448 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5669be4d-29d4-4cee-ad63-75f37e3727d2-kube-api-access-p6zj6" (OuterVolumeSpecName: "kube-api-access-p6zj6") pod "5669be4d-29d4-4cee-ad63-75f37e3727d2" (UID: "5669be4d-29d4-4cee-ad63-75f37e3727d2"). InnerVolumeSpecName "kube-api-access-p6zj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.122719 4781 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.122752 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6zj6\" (UniqueName: \"kubernetes.io/projected/5669be4d-29d4-4cee-ad63-75f37e3727d2-kube-api-access-p6zj6\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.122762 4781 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.122770 4781 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5669be4d-29d4-4cee-ad63-75f37e3727d2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.626679 4781 generic.go:334] "Generic (PLEG): container finished" podID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerID="fe81e6f5b9fddc74059cce1c62849a2c0cb01b6a4335f3d5d062def56465b5db" exitCode=0 Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.626769 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgs7b" event={"ID":"0c17dfaf-9acb-4a0d-8251-f87397c80ece","Type":"ContainerDied","Data":"fe81e6f5b9fddc74059cce1c62849a2c0cb01b6a4335f3d5d062def56465b5db"} Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.629589 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-tmmk6_5669be4d-29d4-4cee-ad63-75f37e3727d2/console/0.log" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.629654 4781 generic.go:334] "Generic (PLEG): container finished" podID="5669be4d-29d4-4cee-ad63-75f37e3727d2" containerID="d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a" exitCode=2 Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.629706 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tmmk6" event={"ID":"5669be4d-29d4-4cee-ad63-75f37e3727d2","Type":"ContainerDied","Data":"d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a"} Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.629772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-tmmk6" event={"ID":"5669be4d-29d4-4cee-ad63-75f37e3727d2","Type":"ContainerDied","Data":"7c55f3a079d194b62af3979b302cba81c43c52fab5c2ee5fd903e38be37609de"} Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.629722 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-tmmk6" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.629793 4781 scope.go:117] "RemoveContainer" containerID="d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.651987 4781 scope.go:117] "RemoveContainer" containerID="d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a" Dec 08 20:17:15 crc kubenswrapper[4781]: E1208 20:17:15.652421 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a\": container with ID starting with d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a not found: ID does not exist" containerID="d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.652448 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a"} err="failed to get container status \"d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a\": rpc error: code = NotFound desc = could not find container \"d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a\": container with ID starting with d945090b17ce10c0716b15620ab2476d5cec44d360f2e7437f33c6374f2c9b4a not found: ID does not exist" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.664960 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-tmmk6"] Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.669103 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-tmmk6"] Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.827387 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.932349 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-util\") pod \"e4836df2-487a-4750-b2fc-28c2fd4394f5\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.932409 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-bundle\") pod \"e4836df2-487a-4750-b2fc-28c2fd4394f5\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.932457 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fl6f\" (UniqueName: \"kubernetes.io/projected/e4836df2-487a-4750-b2fc-28c2fd4394f5-kube-api-access-5fl6f\") pod \"e4836df2-487a-4750-b2fc-28c2fd4394f5\" (UID: \"e4836df2-487a-4750-b2fc-28c2fd4394f5\") " Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.933394 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-bundle" (OuterVolumeSpecName: "bundle") pod "e4836df2-487a-4750-b2fc-28c2fd4394f5" (UID: "e4836df2-487a-4750-b2fc-28c2fd4394f5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.937267 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4836df2-487a-4750-b2fc-28c2fd4394f5-kube-api-access-5fl6f" (OuterVolumeSpecName: "kube-api-access-5fl6f") pod "e4836df2-487a-4750-b2fc-28c2fd4394f5" (UID: "e4836df2-487a-4750-b2fc-28c2fd4394f5"). InnerVolumeSpecName "kube-api-access-5fl6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:17:15 crc kubenswrapper[4781]: I1208 20:17:15.946448 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-util" (OuterVolumeSpecName: "util") pod "e4836df2-487a-4750-b2fc-28c2fd4394f5" (UID: "e4836df2-487a-4750-b2fc-28c2fd4394f5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.033237 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fl6f\" (UniqueName: \"kubernetes.io/projected/e4836df2-487a-4750-b2fc-28c2fd4394f5-kube-api-access-5fl6f\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.033270 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-util\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.033279 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4836df2-487a-4750-b2fc-28c2fd4394f5-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.136136 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5669be4d-29d4-4cee-ad63-75f37e3727d2" path="/var/lib/kubelet/pods/5669be4d-29d4-4cee-ad63-75f37e3727d2/volumes" Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.651770 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgs7b" event={"ID":"0c17dfaf-9acb-4a0d-8251-f87397c80ece","Type":"ContainerStarted","Data":"88afe53c5a8eee7da8cda766b95b369c181a7607767c4baa97fc70c479a2aec2"} Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.656604 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" event={"ID":"e4836df2-487a-4750-b2fc-28c2fd4394f5","Type":"ContainerDied","Data":"4e5a30e58abfff2188ad5af3c3dd5f842d1a887e6ca918bfd2ea5fa447d3a851"} Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.656639 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e5a30e58abfff2188ad5af3c3dd5f842d1a887e6ca918bfd2ea5fa447d3a851" Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.656694 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629" Dec 08 20:17:16 crc kubenswrapper[4781]: I1208 20:17:16.675330 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wgs7b" podStartSLOduration=2.287845319 podStartE2EDuration="4.675309877s" podCreationTimestamp="2025-12-08 20:17:12 +0000 UTC" firstStartedPulling="2025-12-08 20:17:13.605411578 +0000 UTC m=+749.756694955" lastFinishedPulling="2025-12-08 20:17:15.992876126 +0000 UTC m=+752.144159513" observedRunningTime="2025-12-08 20:17:16.673237277 +0000 UTC m=+752.824520694" watchObservedRunningTime="2025-12-08 20:17:16.675309877 +0000 UTC m=+752.826593264" Dec 08 20:17:17 crc kubenswrapper[4781]: I1208 20:17:17.197620 4781 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 08 20:17:23 crc kubenswrapper[4781]: I1208 20:17:23.098313 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:23 crc kubenswrapper[4781]: I1208 20:17:23.099019 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:23 crc kubenswrapper[4781]: I1208 20:17:23.165904 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:23 crc kubenswrapper[4781]: I1208 20:17:23.750679 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.511988 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgs7b"] Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.645741 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5"] Dec 08 20:17:25 crc kubenswrapper[4781]: E1208 20:17:25.645969 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerName="util" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.645980 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerName="util" Dec 08 20:17:25 crc kubenswrapper[4781]: E1208 20:17:25.645989 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerName="pull" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.645996 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerName="pull" Dec 08 20:17:25 crc kubenswrapper[4781]: E1208 20:17:25.646006 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5669be4d-29d4-4cee-ad63-75f37e3727d2" containerName="console" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.646012 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5669be4d-29d4-4cee-ad63-75f37e3727d2" containerName="console" Dec 08 20:17:25 crc kubenswrapper[4781]: E1208 20:17:25.646020 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerName="extract" Dec 08 20:17:25 crc 
kubenswrapper[4781]: I1208 20:17:25.646026 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerName="extract" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.646133 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4836df2-487a-4750-b2fc-28c2fd4394f5" containerName="extract" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.646144 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5669be4d-29d4-4cee-ad63-75f37e3727d2" containerName="console" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.646507 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.648321 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.648562 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.658113 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-7cwgg" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.658115 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.661165 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.664945 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5"] Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.702512 4781 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-wgs7b" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerName="registry-server" containerID="cri-o://88afe53c5a8eee7da8cda766b95b369c181a7607767c4baa97fc70c479a2aec2" gracePeriod=2 Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.756126 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd0edabf-1168-40aa-b197-c87637432272-apiservice-cert\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.756173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd0edabf-1168-40aa-b197-c87637432272-webhook-cert\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.756196 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9vwf\" (UniqueName: \"kubernetes.io/projected/dd0edabf-1168-40aa-b197-c87637432272-kube-api-access-m9vwf\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.857154 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd0edabf-1168-40aa-b197-c87637432272-apiservice-cert\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " 
pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.857211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd0edabf-1168-40aa-b197-c87637432272-webhook-cert\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.857238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9vwf\" (UniqueName: \"kubernetes.io/projected/dd0edabf-1168-40aa-b197-c87637432272-kube-api-access-m9vwf\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.864037 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd0edabf-1168-40aa-b197-c87637432272-webhook-cert\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.864427 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd0edabf-1168-40aa-b197-c87637432272-apiservice-cert\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.927955 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9vwf\" (UniqueName: 
\"kubernetes.io/projected/dd0edabf-1168-40aa-b197-c87637432272-kube-api-access-m9vwf\") pod \"metallb-operator-controller-manager-79f7dffd6f-8vls5\" (UID: \"dd0edabf-1168-40aa-b197-c87637432272\") " pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:25 crc kubenswrapper[4781]: I1208 20:17:25.972491 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.107959 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-dff497d76-fmghx"] Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.108975 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.112444 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.112494 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.112446 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tlz2v" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.122952 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-dff497d76-fmghx"] Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.263494 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f195b37d-be68-434b-8295-23a8208108b8-apiservice-cert\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " 
pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.263623 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg66\" (UniqueName: \"kubernetes.io/projected/f195b37d-be68-434b-8295-23a8208108b8-kube-api-access-fgg66\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.263666 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f195b37d-be68-434b-8295-23a8208108b8-webhook-cert\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.364986 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f195b37d-be68-434b-8295-23a8208108b8-apiservice-cert\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.365111 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg66\" (UniqueName: \"kubernetes.io/projected/f195b37d-be68-434b-8295-23a8208108b8-kube-api-access-fgg66\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.365177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f195b37d-be68-434b-8295-23a8208108b8-webhook-cert\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.370539 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f195b37d-be68-434b-8295-23a8208108b8-apiservice-cert\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.371427 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f195b37d-be68-434b-8295-23a8208108b8-webhook-cert\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.391241 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg66\" (UniqueName: \"kubernetes.io/projected/f195b37d-be68-434b-8295-23a8208108b8-kube-api-access-fgg66\") pod \"metallb-operator-webhook-server-dff497d76-fmghx\" (UID: \"f195b37d-be68-434b-8295-23a8208108b8\") " pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.451893 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.503677 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5"] Dec 08 20:17:26 crc kubenswrapper[4781]: W1208 20:17:26.514782 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0edabf_1168_40aa_b197_c87637432272.slice/crio-de8571e77abbccb3ab061f456bc74f6fc899896210cfd6550d76e3e8aeb9c635 WatchSource:0}: Error finding container de8571e77abbccb3ab061f456bc74f6fc899896210cfd6550d76e3e8aeb9c635: Status 404 returned error can't find the container with id de8571e77abbccb3ab061f456bc74f6fc899896210cfd6550d76e3e8aeb9c635 Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.663856 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-dff497d76-fmghx"] Dec 08 20:17:26 crc kubenswrapper[4781]: W1208 20:17:26.669043 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf195b37d_be68_434b_8295_23a8208108b8.slice/crio-f021ea209deb902836401bd50d6d1dee29b2e30525cdba1db219bd0d9791af4b WatchSource:0}: Error finding container f021ea209deb902836401bd50d6d1dee29b2e30525cdba1db219bd0d9791af4b: Status 404 returned error can't find the container with id f021ea209deb902836401bd50d6d1dee29b2e30525cdba1db219bd0d9791af4b Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.709600 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" event={"ID":"f195b37d-be68-434b-8295-23a8208108b8","Type":"ContainerStarted","Data":"f021ea209deb902836401bd50d6d1dee29b2e30525cdba1db219bd0d9791af4b"} Dec 08 20:17:26 crc kubenswrapper[4781]: I1208 20:17:26.711579 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" event={"ID":"dd0edabf-1168-40aa-b197-c87637432272","Type":"ContainerStarted","Data":"de8571e77abbccb3ab061f456bc74f6fc899896210cfd6550d76e3e8aeb9c635"} Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.734401 4781 generic.go:334] "Generic (PLEG): container finished" podID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerID="88afe53c5a8eee7da8cda766b95b369c181a7607767c4baa97fc70c479a2aec2" exitCode=0 Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.735027 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgs7b" event={"ID":"0c17dfaf-9acb-4a0d-8251-f87397c80ece","Type":"ContainerDied","Data":"88afe53c5a8eee7da8cda766b95b369c181a7607767c4baa97fc70c479a2aec2"} Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.735067 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgs7b" event={"ID":"0c17dfaf-9acb-4a0d-8251-f87397c80ece","Type":"ContainerDied","Data":"a326b053f0d7f764a0b511c71a99753b134a3df1279c1dfc60b4595e78c3e343"} Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.735082 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a326b053f0d7f764a0b511c71a99753b134a3df1279c1dfc60b4595e78c3e343" Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.762892 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.902502 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-utilities\") pod \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.902598 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-catalog-content\") pod \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.902648 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nbph\" (UniqueName: \"kubernetes.io/projected/0c17dfaf-9acb-4a0d-8251-f87397c80ece-kube-api-access-6nbph\") pod \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\" (UID: \"0c17dfaf-9acb-4a0d-8251-f87397c80ece\") " Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.903530 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-utilities" (OuterVolumeSpecName: "utilities") pod "0c17dfaf-9acb-4a0d-8251-f87397c80ece" (UID: "0c17dfaf-9acb-4a0d-8251-f87397c80ece"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:17:28 crc kubenswrapper[4781]: I1208 20:17:28.908192 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c17dfaf-9acb-4a0d-8251-f87397c80ece-kube-api-access-6nbph" (OuterVolumeSpecName: "kube-api-access-6nbph") pod "0c17dfaf-9acb-4a0d-8251-f87397c80ece" (UID: "0c17dfaf-9acb-4a0d-8251-f87397c80ece"). InnerVolumeSpecName "kube-api-access-6nbph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:17:29 crc kubenswrapper[4781]: I1208 20:17:29.004018 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:29 crc kubenswrapper[4781]: I1208 20:17:29.004052 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nbph\" (UniqueName: \"kubernetes.io/projected/0c17dfaf-9acb-4a0d-8251-f87397c80ece-kube-api-access-6nbph\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:29 crc kubenswrapper[4781]: I1208 20:17:29.050145 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c17dfaf-9acb-4a0d-8251-f87397c80ece" (UID: "0c17dfaf-9acb-4a0d-8251-f87397c80ece"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:17:29 crc kubenswrapper[4781]: I1208 20:17:29.105663 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c17dfaf-9acb-4a0d-8251-f87397c80ece-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:17:29 crc kubenswrapper[4781]: I1208 20:17:29.747442 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgs7b" Dec 08 20:17:29 crc kubenswrapper[4781]: I1208 20:17:29.792380 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgs7b"] Dec 08 20:17:29 crc kubenswrapper[4781]: I1208 20:17:29.804837 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wgs7b"] Dec 08 20:17:30 crc kubenswrapper[4781]: I1208 20:17:30.135304 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" path="/var/lib/kubelet/pods/0c17dfaf-9acb-4a0d-8251-f87397c80ece/volumes" Dec 08 20:17:30 crc kubenswrapper[4781]: I1208 20:17:30.756754 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" event={"ID":"dd0edabf-1168-40aa-b197-c87637432272","Type":"ContainerStarted","Data":"fbce1dedd3f6d96cbb5eb5471b9cce6cbc7f1fe65235c721ff69ace87e558687"} Dec 08 20:17:30 crc kubenswrapper[4781]: I1208 20:17:30.757144 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:17:30 crc kubenswrapper[4781]: I1208 20:17:30.783636 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" podStartSLOduration=1.7575959239999999 podStartE2EDuration="5.783614177s" podCreationTimestamp="2025-12-08 20:17:25 +0000 UTC" firstStartedPulling="2025-12-08 20:17:26.517118686 +0000 UTC m=+762.668402063" lastFinishedPulling="2025-12-08 20:17:30.543136939 +0000 UTC m=+766.694420316" observedRunningTime="2025-12-08 20:17:30.78024734 +0000 UTC m=+766.931530707" watchObservedRunningTime="2025-12-08 20:17:30.783614177 +0000 UTC m=+766.934897544" Dec 08 20:17:32 crc kubenswrapper[4781]: I1208 20:17:32.769196 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" event={"ID":"f195b37d-be68-434b-8295-23a8208108b8","Type":"ContainerStarted","Data":"e2eb5d795231d967418622e73c5d03c6b22f2a740d5a86b882d9a72f0b45c2d0"} Dec 08 20:17:32 crc kubenswrapper[4781]: I1208 20:17:32.769564 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:32 crc kubenswrapper[4781]: I1208 20:17:32.788430 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" podStartSLOduration=1.448401106 podStartE2EDuration="6.788412617s" podCreationTimestamp="2025-12-08 20:17:26 +0000 UTC" firstStartedPulling="2025-12-08 20:17:26.671858578 +0000 UTC m=+762.823141955" lastFinishedPulling="2025-12-08 20:17:32.011870079 +0000 UTC m=+768.163153466" observedRunningTime="2025-12-08 20:17:32.784238887 +0000 UTC m=+768.935522264" watchObservedRunningTime="2025-12-08 20:17:32.788412617 +0000 UTC m=+768.939695994" Dec 08 20:17:46 crc kubenswrapper[4781]: I1208 20:17:46.457132 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-dff497d76-fmghx" Dec 08 20:17:59 crc kubenswrapper[4781]: I1208 20:17:59.948031 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:17:59 crc kubenswrapper[4781]: I1208 20:17:59.948584 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 08 20:18:05 crc kubenswrapper[4781]: I1208 20:18:05.975550 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-79f7dffd6f-8vls5" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.696933 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps"] Dec 08 20:18:06 crc kubenswrapper[4781]: E1208 20:18:06.697261 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerName="extract-utilities" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.697286 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerName="extract-utilities" Dec 08 20:18:06 crc kubenswrapper[4781]: E1208 20:18:06.697305 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerName="extract-content" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.697316 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerName="extract-content" Dec 08 20:18:06 crc kubenswrapper[4781]: E1208 20:18:06.697336 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerName="registry-server" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.697344 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerName="registry-server" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.697489 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c17dfaf-9acb-4a0d-8251-f87397c80ece" containerName="registry-server" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.698015 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.699795 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.700019 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-n94q6" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.703491 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-982hg"] Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.706099 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.711274 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.711552 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.716525 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps"] Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.786718 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7rq9\" (UniqueName: \"kubernetes.io/projected/8bf0dbea-2341-414b-89fc-e36f1150ee8e-kube-api-access-c7rq9\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.786792 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17dc730c-3861-434c-aeed-de6287a1c55b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lwsps\" (UID: 
\"17dc730c-3861-434c-aeed-de6287a1c55b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.786853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-sockets\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.786877 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-conf\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.787078 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-startup\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.787116 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-metrics\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.787213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8bf0dbea-2341-414b-89fc-e36f1150ee8e-metrics-certs\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc 
kubenswrapper[4781]: I1208 20:18:06.787249 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-reloader\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.787272 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmx69\" (UniqueName: \"kubernetes.io/projected/17dc730c-3861-434c-aeed-de6287a1c55b-kube-api-access-fmx69\") pod \"frr-k8s-webhook-server-7fcb986d4-lwsps\" (UID: \"17dc730c-3861-434c-aeed-de6287a1c55b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.791468 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-b4s7x"] Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.792561 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.795910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.796504 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.797623 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4f9p2" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.798311 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.819030 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-k4scp"] Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.819956 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.822273 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.834380 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-k4scp"] Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.888775 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee2c128a-e154-4685-957c-1cd19b86f113-cert\") pod \"controller-f8648f98b-k4scp\" (UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.888827 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-reloader\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.888848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8bf0dbea-2341-414b-89fc-e36f1150ee8e-metrics-certs\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.888874 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmx69\" (UniqueName: \"kubernetes.io/projected/17dc730c-3861-434c-aeed-de6287a1c55b-kube-api-access-fmx69\") pod \"frr-k8s-webhook-server-7fcb986d4-lwsps\" (UID: \"17dc730c-3861-434c-aeed-de6287a1c55b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 
20:18:06.888904 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/95f89c56-40b6-4d4b-8060-154674b55a18-metallb-excludel2\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.888944 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-metrics-certs\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.888976 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7rq9\" (UniqueName: \"kubernetes.io/projected/8bf0dbea-2341-414b-89fc-e36f1150ee8e-kube-api-access-c7rq9\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889004 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-sockets\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889024 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-conf\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889043 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/17dc730c-3861-434c-aeed-de6287a1c55b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lwsps\" (UID: \"17dc730c-3861-434c-aeed-de6287a1c55b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889096 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2c128a-e154-4685-957c-1cd19b86f113-metrics-certs\") pod \"controller-f8648f98b-k4scp\" (UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889119 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpz2z\" (UniqueName: \"kubernetes.io/projected/ee2c128a-e154-4685-957c-1cd19b86f113-kube-api-access-kpz2z\") pod \"controller-f8648f98b-k4scp\" (UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889146 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889181 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-startup\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-metrics\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889237 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx9nl\" (UniqueName: \"kubernetes.io/projected/95f89c56-40b6-4d4b-8060-154674b55a18-kube-api-access-hx9nl\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889412 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-reloader\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.889663 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-sockets\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.890294 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-startup\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.891034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-metrics\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc 
kubenswrapper[4781]: I1208 20:18:06.891082 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8bf0dbea-2341-414b-89fc-e36f1150ee8e-frr-conf\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.895619 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8bf0dbea-2341-414b-89fc-e36f1150ee8e-metrics-certs\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.896728 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/17dc730c-3861-434c-aeed-de6287a1c55b-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lwsps\" (UID: \"17dc730c-3861-434c-aeed-de6287a1c55b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.906425 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7rq9\" (UniqueName: \"kubernetes.io/projected/8bf0dbea-2341-414b-89fc-e36f1150ee8e-kube-api-access-c7rq9\") pod \"frr-k8s-982hg\" (UID: \"8bf0dbea-2341-414b-89fc-e36f1150ee8e\") " pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.925178 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmx69\" (UniqueName: \"kubernetes.io/projected/17dc730c-3861-434c-aeed-de6287a1c55b-kube-api-access-fmx69\") pod \"frr-k8s-webhook-server-7fcb986d4-lwsps\" (UID: \"17dc730c-3861-434c-aeed-de6287a1c55b\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.990041 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/ee2c128a-e154-4685-957c-1cd19b86f113-cert\") pod \"controller-f8648f98b-k4scp\" (UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.990087 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/95f89c56-40b6-4d4b-8060-154674b55a18-metallb-excludel2\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.990102 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-metrics-certs\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.990152 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2c128a-e154-4685-957c-1cd19b86f113-metrics-certs\") pod \"controller-f8648f98b-k4scp\" (UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.990171 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.990185 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpz2z\" (UniqueName: \"kubernetes.io/projected/ee2c128a-e154-4685-957c-1cd19b86f113-kube-api-access-kpz2z\") pod \"controller-f8648f98b-k4scp\" 
(UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.990221 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx9nl\" (UniqueName: \"kubernetes.io/projected/95f89c56-40b6-4d4b-8060-154674b55a18-kube-api-access-hx9nl\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: E1208 20:18:06.990585 4781 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 08 20:18:06 crc kubenswrapper[4781]: E1208 20:18:06.990627 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist podName:95f89c56-40b6-4d4b-8060-154674b55a18 nodeName:}" failed. No retries permitted until 2025-12-08 20:18:07.490609592 +0000 UTC m=+803.641892969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist") pod "speaker-b4s7x" (UID: "95f89c56-40b6-4d4b-8060-154674b55a18") : secret "metallb-memberlist" not found Dec 08 20:18:06 crc kubenswrapper[4781]: E1208 20:18:06.990870 4781 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 08 20:18:06 crc kubenswrapper[4781]: E1208 20:18:06.990898 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-metrics-certs podName:95f89c56-40b6-4d4b-8060-154674b55a18 nodeName:}" failed. No retries permitted until 2025-12-08 20:18:07.49089126 +0000 UTC m=+803.642174637 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-metrics-certs") pod "speaker-b4s7x" (UID: "95f89c56-40b6-4d4b-8060-154674b55a18") : secret "speaker-certs-secret" not found Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.990910 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/95f89c56-40b6-4d4b-8060-154674b55a18-metallb-excludel2\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.992304 4781 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 08 20:18:06 crc kubenswrapper[4781]: I1208 20:18:06.994339 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2c128a-e154-4685-957c-1cd19b86f113-metrics-certs\") pod \"controller-f8648f98b-k4scp\" (UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.006397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee2c128a-e154-4685-957c-1cd19b86f113-cert\") pod \"controller-f8648f98b-k4scp\" (UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.031400 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.035943 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx9nl\" (UniqueName: \"kubernetes.io/projected/95f89c56-40b6-4d4b-8060-154674b55a18-kube-api-access-hx9nl\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.043616 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.090762 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpz2z\" (UniqueName: \"kubernetes.io/projected/ee2c128a-e154-4685-957c-1cd19b86f113-kube-api-access-kpz2z\") pod \"controller-f8648f98b-k4scp\" (UID: \"ee2c128a-e154-4685-957c-1cd19b86f113\") " pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.134370 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.501501 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-metrics-certs\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.501817 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:07 crc kubenswrapper[4781]: E1208 20:18:07.501957 4781 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 08 20:18:07 crc kubenswrapper[4781]: E1208 20:18:07.502024 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist podName:95f89c56-40b6-4d4b-8060-154674b55a18 nodeName:}" failed. No retries permitted until 2025-12-08 20:18:08.502004569 +0000 UTC m=+804.653287936 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist") pod "speaker-b4s7x" (UID: "95f89c56-40b6-4d4b-8060-154674b55a18") : secret "metallb-memberlist" not found Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.506960 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-metrics-certs\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.520761 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps"] Dec 08 20:18:07 crc kubenswrapper[4781]: W1208 20:18:07.528870 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17dc730c_3861_434c_aeed_de6287a1c55b.slice/crio-9ad2b4c2a3a9e6ba7d71b2a59e76deecc1ca69c402125f5be65d3df6b08fb66a WatchSource:0}: Error finding container 9ad2b4c2a3a9e6ba7d71b2a59e76deecc1ca69c402125f5be65d3df6b08fb66a: Status 404 returned error can't find the container with id 9ad2b4c2a3a9e6ba7d71b2a59e76deecc1ca69c402125f5be65d3df6b08fb66a Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.587512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-k4scp"] Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.972612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" event={"ID":"17dc730c-3861-434c-aeed-de6287a1c55b","Type":"ContainerStarted","Data":"9ad2b4c2a3a9e6ba7d71b2a59e76deecc1ca69c402125f5be65d3df6b08fb66a"} Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.975291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-k4scp" 
event={"ID":"ee2c128a-e154-4685-957c-1cd19b86f113","Type":"ContainerStarted","Data":"9d032e963582f1b0d0c86209bd3c8bf8d71e2ca3d1a13cd32a66c9cbf845ab1a"} Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.975348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-k4scp" event={"ID":"ee2c128a-e154-4685-957c-1cd19b86f113","Type":"ContainerStarted","Data":"4842877f19f27756f28aa30d2427bb35bc32816d89b2ab0a7430598965773d9b"} Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.975363 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-k4scp" event={"ID":"ee2c128a-e154-4685-957c-1cd19b86f113","Type":"ContainerStarted","Data":"c1451ebf1e7bf91a6f01ceca7f09d647265d84f3824b9f5755aa251cbbadcba8"} Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.975473 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.976893 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerStarted","Data":"db2c02418ac8ce89a50d90f0c3548f92dc9ed07bf8105226ee32d956a26c5a82"} Dec 08 20:18:07 crc kubenswrapper[4781]: I1208 20:18:07.989718 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-k4scp" podStartSLOduration=1.989695714 podStartE2EDuration="1.989695714s" podCreationTimestamp="2025-12-08 20:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:18:07.988942432 +0000 UTC m=+804.140225809" watchObservedRunningTime="2025-12-08 20:18:07.989695714 +0000 UTC m=+804.140979091" Dec 08 20:18:08 crc kubenswrapper[4781]: I1208 20:18:08.514156 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memberlist\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:08 crc kubenswrapper[4781]: I1208 20:18:08.518969 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/95f89c56-40b6-4d4b-8060-154674b55a18-memberlist\") pod \"speaker-b4s7x\" (UID: \"95f89c56-40b6-4d4b-8060-154674b55a18\") " pod="metallb-system/speaker-b4s7x" Dec 08 20:18:08 crc kubenswrapper[4781]: I1208 20:18:08.607843 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-b4s7x" Dec 08 20:18:08 crc kubenswrapper[4781]: W1208 20:18:08.667562 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95f89c56_40b6_4d4b_8060_154674b55a18.slice/crio-4cc56ea7eada3e74c5a45322fb68d40aaa579d495883e94c5e6c77b0f2cb084e WatchSource:0}: Error finding container 4cc56ea7eada3e74c5a45322fb68d40aaa579d495883e94c5e6c77b0f2cb084e: Status 404 returned error can't find the container with id 4cc56ea7eada3e74c5a45322fb68d40aaa579d495883e94c5e6c77b0f2cb084e Dec 08 20:18:09 crc kubenswrapper[4781]: I1208 20:18:09.000279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b4s7x" event={"ID":"95f89c56-40b6-4d4b-8060-154674b55a18","Type":"ContainerStarted","Data":"4cc56ea7eada3e74c5a45322fb68d40aaa579d495883e94c5e6c77b0f2cb084e"} Dec 08 20:18:10 crc kubenswrapper[4781]: I1208 20:18:10.014788 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b4s7x" event={"ID":"95f89c56-40b6-4d4b-8060-154674b55a18","Type":"ContainerStarted","Data":"6a05ff77f14122b1c7099bfa9b2268b86997b21d32415b25dc6bd66cb8ec14fd"} Dec 08 20:18:10 crc kubenswrapper[4781]: I1208 20:18:10.015231 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/speaker-b4s7x" event={"ID":"95f89c56-40b6-4d4b-8060-154674b55a18","Type":"ContainerStarted","Data":"a2b67533664067857a0507f9e7a01f31041045200e5ab24e909075c75ecb1c41"} Dec 08 20:18:10 crc kubenswrapper[4781]: I1208 20:18:10.015249 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-b4s7x" Dec 08 20:18:10 crc kubenswrapper[4781]: I1208 20:18:10.034806 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-b4s7x" podStartSLOduration=4.034784816 podStartE2EDuration="4.034784816s" podCreationTimestamp="2025-12-08 20:18:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:18:10.030791811 +0000 UTC m=+806.182075188" watchObservedRunningTime="2025-12-08 20:18:10.034784816 +0000 UTC m=+806.186068203" Dec 08 20:18:15 crc kubenswrapper[4781]: I1208 20:18:15.109164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" event={"ID":"17dc730c-3861-434c-aeed-de6287a1c55b","Type":"ContainerStarted","Data":"d7e7b2e9c0d7c917c44055066bcc20d37ef38e881c754a9add58ce26ba6901ca"} Dec 08 20:18:15 crc kubenswrapper[4781]: I1208 20:18:15.109777 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:15 crc kubenswrapper[4781]: I1208 20:18:15.112600 4781 generic.go:334] "Generic (PLEG): container finished" podID="8bf0dbea-2341-414b-89fc-e36f1150ee8e" containerID="922c36d19706bbebe7011af0d7222d5990125dcb09c210dbe84c2b82f627cc2b" exitCode=0 Dec 08 20:18:15 crc kubenswrapper[4781]: I1208 20:18:15.112646 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerDied","Data":"922c36d19706bbebe7011af0d7222d5990125dcb09c210dbe84c2b82f627cc2b"} Dec 08 
20:18:15 crc kubenswrapper[4781]: I1208 20:18:15.166526 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" podStartSLOduration=2.670747613 podStartE2EDuration="9.166504336s" podCreationTimestamp="2025-12-08 20:18:06 +0000 UTC" firstStartedPulling="2025-12-08 20:18:07.531695613 +0000 UTC m=+803.682978990" lastFinishedPulling="2025-12-08 20:18:14.027452336 +0000 UTC m=+810.178735713" observedRunningTime="2025-12-08 20:18:15.13156389 +0000 UTC m=+811.282847267" watchObservedRunningTime="2025-12-08 20:18:15.166504336 +0000 UTC m=+811.317787723" Dec 08 20:18:16 crc kubenswrapper[4781]: I1208 20:18:16.125842 4781 generic.go:334] "Generic (PLEG): container finished" podID="8bf0dbea-2341-414b-89fc-e36f1150ee8e" containerID="c49ae39e75b1dd6e92fa28364bd4517eb529a17e08a035572f5f8245ce254128" exitCode=0 Dec 08 20:18:16 crc kubenswrapper[4781]: I1208 20:18:16.148182 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerDied","Data":"c49ae39e75b1dd6e92fa28364bd4517eb529a17e08a035572f5f8245ce254128"} Dec 08 20:18:17 crc kubenswrapper[4781]: I1208 20:18:17.134385 4781 generic.go:334] "Generic (PLEG): container finished" podID="8bf0dbea-2341-414b-89fc-e36f1150ee8e" containerID="fb2da1d65247746e67ad8a7f6c4683c3f4c2a7a4177db2b1cac594250949ef6a" exitCode=0 Dec 08 20:18:17 crc kubenswrapper[4781]: I1208 20:18:17.134442 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerDied","Data":"fb2da1d65247746e67ad8a7f6c4683c3f4c2a7a4177db2b1cac594250949ef6a"} Dec 08 20:18:17 crc kubenswrapper[4781]: I1208 20:18:17.139021 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-k4scp" Dec 08 20:18:18 crc kubenswrapper[4781]: I1208 20:18:18.144260 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerStarted","Data":"b7470297d8fab3907b5bcc488e02c833398319f55c41ceca6ba8be035abe095b"} Dec 08 20:18:18 crc kubenswrapper[4781]: I1208 20:18:18.144567 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:18 crc kubenswrapper[4781]: I1208 20:18:18.144580 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerStarted","Data":"d2666519ff6aab55f7000d54f03832bb2978e5d8a8a4ece989a5bbaeb0c72ccd"} Dec 08 20:18:18 crc kubenswrapper[4781]: I1208 20:18:18.144590 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerStarted","Data":"d50eb7860d3b290e093975c7f52814dc25afd31bfb7c7a6304454f6c17ca236f"} Dec 08 20:18:18 crc kubenswrapper[4781]: I1208 20:18:18.144601 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerStarted","Data":"a3dcae3137d889c7cbe231039475fe53eaf6432fffee78651d2c9bc4cf2201d4"} Dec 08 20:18:18 crc kubenswrapper[4781]: I1208 20:18:18.144610 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerStarted","Data":"158dcde81169bb3d7932a91608573c9b9ebe8897936074cb57aa4b670f1242a8"} Dec 08 20:18:18 crc kubenswrapper[4781]: I1208 20:18:18.144618 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-982hg" event={"ID":"8bf0dbea-2341-414b-89fc-e36f1150ee8e","Type":"ContainerStarted","Data":"9bb1362c20a60d16e3f4e251dbd7b40141a4c2f80a22d57d3883585a11e86e78"} Dec 08 20:18:18 crc kubenswrapper[4781]: I1208 20:18:18.165722 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-982hg" podStartSLOduration=5.332594774 podStartE2EDuration="12.165702505s" podCreationTimestamp="2025-12-08 20:18:06 +0000 UTC" firstStartedPulling="2025-12-08 20:18:07.194332505 +0000 UTC m=+803.345615882" lastFinishedPulling="2025-12-08 20:18:14.027440246 +0000 UTC m=+810.178723613" observedRunningTime="2025-12-08 20:18:18.161884715 +0000 UTC m=+814.313168102" watchObservedRunningTime="2025-12-08 20:18:18.165702505 +0000 UTC m=+814.316985882" Dec 08 20:18:22 crc kubenswrapper[4781]: I1208 20:18:22.045418 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:22 crc kubenswrapper[4781]: I1208 20:18:22.079723 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:27 crc kubenswrapper[4781]: I1208 20:18:27.036064 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lwsps" Dec 08 20:18:27 crc kubenswrapper[4781]: I1208 20:18:27.047221 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-982hg" Dec 08 20:18:28 crc kubenswrapper[4781]: I1208 20:18:28.611286 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-b4s7x" Dec 08 20:18:29 crc kubenswrapper[4781]: I1208 20:18:29.948677 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:18:29 crc kubenswrapper[4781]: I1208 20:18:29.949194 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" 
podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.720124 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rh2lk"] Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.721029 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rh2lk" Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.724127 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-rspkn" Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.724952 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.725348 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.740544 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pfs8\" (UniqueName: \"kubernetes.io/projected/f012243e-0cc8-482e-922a-b81ba9644786-kube-api-access-4pfs8\") pod \"openstack-operator-index-rh2lk\" (UID: \"f012243e-0cc8-482e-922a-b81ba9644786\") " pod="openstack-operators/openstack-operator-index-rh2lk" Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.745516 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rh2lk"] Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.841863 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pfs8\" (UniqueName: 
\"kubernetes.io/projected/f012243e-0cc8-482e-922a-b81ba9644786-kube-api-access-4pfs8\") pod \"openstack-operator-index-rh2lk\" (UID: \"f012243e-0cc8-482e-922a-b81ba9644786\") " pod="openstack-operators/openstack-operator-index-rh2lk" Dec 08 20:18:31 crc kubenswrapper[4781]: I1208 20:18:31.859577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pfs8\" (UniqueName: \"kubernetes.io/projected/f012243e-0cc8-482e-922a-b81ba9644786-kube-api-access-4pfs8\") pod \"openstack-operator-index-rh2lk\" (UID: \"f012243e-0cc8-482e-922a-b81ba9644786\") " pod="openstack-operators/openstack-operator-index-rh2lk" Dec 08 20:18:32 crc kubenswrapper[4781]: I1208 20:18:32.042505 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rh2lk" Dec 08 20:18:32 crc kubenswrapper[4781]: I1208 20:18:32.238235 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rh2lk"] Dec 08 20:18:32 crc kubenswrapper[4781]: W1208 20:18:32.246506 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf012243e_0cc8_482e_922a_b81ba9644786.slice/crio-c122ee9a411e85881316928e0f65f776b54dcb495999f0f2fb64b5de10a49f80 WatchSource:0}: Error finding container c122ee9a411e85881316928e0f65f776b54dcb495999f0f2fb64b5de10a49f80: Status 404 returned error can't find the container with id c122ee9a411e85881316928e0f65f776b54dcb495999f0f2fb64b5de10a49f80 Dec 08 20:18:33 crc kubenswrapper[4781]: I1208 20:18:33.240703 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rh2lk" event={"ID":"f012243e-0cc8-482e-922a-b81ba9644786","Type":"ContainerStarted","Data":"c122ee9a411e85881316928e0f65f776b54dcb495999f0f2fb64b5de10a49f80"} Dec 08 20:18:35 crc kubenswrapper[4781]: I1208 20:18:35.089027 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-rh2lk"] Dec 08 20:18:35 crc kubenswrapper[4781]: I1208 20:18:35.257012 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rh2lk" event={"ID":"f012243e-0cc8-482e-922a-b81ba9644786","Type":"ContainerStarted","Data":"bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04"} Dec 08 20:18:35 crc kubenswrapper[4781]: I1208 20:18:35.284553 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rh2lk" podStartSLOduration=2.359588581 podStartE2EDuration="4.284527686s" podCreationTimestamp="2025-12-08 20:18:31 +0000 UTC" firstStartedPulling="2025-12-08 20:18:32.250505975 +0000 UTC m=+828.401789352" lastFinishedPulling="2025-12-08 20:18:34.17544508 +0000 UTC m=+830.326728457" observedRunningTime="2025-12-08 20:18:35.276540006 +0000 UTC m=+831.427823393" watchObservedRunningTime="2025-12-08 20:18:35.284527686 +0000 UTC m=+831.435811063" Dec 08 20:18:35 crc kubenswrapper[4781]: I1208 20:18:35.695880 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-924xv"] Dec 08 20:18:35 crc kubenswrapper[4781]: I1208 20:18:35.696843 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-924xv" Dec 08 20:18:35 crc kubenswrapper[4781]: I1208 20:18:35.709819 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-924xv"] Dec 08 20:18:35 crc kubenswrapper[4781]: I1208 20:18:35.891213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxsgj\" (UniqueName: \"kubernetes.io/projected/fa455afd-33f9-4f97-9eb8-838444176453-kube-api-access-zxsgj\") pod \"openstack-operator-index-924xv\" (UID: \"fa455afd-33f9-4f97-9eb8-838444176453\") " pod="openstack-operators/openstack-operator-index-924xv" Dec 08 20:18:35 crc kubenswrapper[4781]: I1208 20:18:35.992277 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxsgj\" (UniqueName: \"kubernetes.io/projected/fa455afd-33f9-4f97-9eb8-838444176453-kube-api-access-zxsgj\") pod \"openstack-operator-index-924xv\" (UID: \"fa455afd-33f9-4f97-9eb8-838444176453\") " pod="openstack-operators/openstack-operator-index-924xv" Dec 08 20:18:36 crc kubenswrapper[4781]: I1208 20:18:36.016414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxsgj\" (UniqueName: \"kubernetes.io/projected/fa455afd-33f9-4f97-9eb8-838444176453-kube-api-access-zxsgj\") pod \"openstack-operator-index-924xv\" (UID: \"fa455afd-33f9-4f97-9eb8-838444176453\") " pod="openstack-operators/openstack-operator-index-924xv" Dec 08 20:18:36 crc kubenswrapper[4781]: I1208 20:18:36.262447 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rh2lk" podUID="f012243e-0cc8-482e-922a-b81ba9644786" containerName="registry-server" containerID="cri-o://bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04" gracePeriod=2 Dec 08 20:18:36 crc kubenswrapper[4781]: I1208 20:18:36.316245 4781 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-924xv" Dec 08 20:18:36 crc kubenswrapper[4781]: I1208 20:18:36.644728 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rh2lk" Dec 08 20:18:36 crc kubenswrapper[4781]: I1208 20:18:36.735812 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-924xv"] Dec 08 20:18:36 crc kubenswrapper[4781]: W1208 20:18:36.740436 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa455afd_33f9_4f97_9eb8_838444176453.slice/crio-126b6d742298b983f9b4a57092ea70f422cbe57ade3c51ce74d084c9286ec6e6 WatchSource:0}: Error finding container 126b6d742298b983f9b4a57092ea70f422cbe57ade3c51ce74d084c9286ec6e6: Status 404 returned error can't find the container with id 126b6d742298b983f9b4a57092ea70f422cbe57ade3c51ce74d084c9286ec6e6 Dec 08 20:18:36 crc kubenswrapper[4781]: I1208 20:18:36.803607 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pfs8\" (UniqueName: \"kubernetes.io/projected/f012243e-0cc8-482e-922a-b81ba9644786-kube-api-access-4pfs8\") pod \"f012243e-0cc8-482e-922a-b81ba9644786\" (UID: \"f012243e-0cc8-482e-922a-b81ba9644786\") " Dec 08 20:18:36 crc kubenswrapper[4781]: I1208 20:18:36.809748 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f012243e-0cc8-482e-922a-b81ba9644786-kube-api-access-4pfs8" (OuterVolumeSpecName: "kube-api-access-4pfs8") pod "f012243e-0cc8-482e-922a-b81ba9644786" (UID: "f012243e-0cc8-482e-922a-b81ba9644786"). InnerVolumeSpecName "kube-api-access-4pfs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:18:36 crc kubenswrapper[4781]: I1208 20:18:36.906024 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pfs8\" (UniqueName: \"kubernetes.io/projected/f012243e-0cc8-482e-922a-b81ba9644786-kube-api-access-4pfs8\") on node \"crc\" DevicePath \"\"" Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.270967 4781 generic.go:334] "Generic (PLEG): container finished" podID="f012243e-0cc8-482e-922a-b81ba9644786" containerID="bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04" exitCode=0 Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.271035 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rh2lk" event={"ID":"f012243e-0cc8-482e-922a-b81ba9644786","Type":"ContainerDied","Data":"bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04"} Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.271065 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rh2lk" event={"ID":"f012243e-0cc8-482e-922a-b81ba9644786","Type":"ContainerDied","Data":"c122ee9a411e85881316928e0f65f776b54dcb495999f0f2fb64b5de10a49f80"} Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.271084 4781 scope.go:117] "RemoveContainer" containerID="bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04" Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.271183 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rh2lk" Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.273231 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-924xv" event={"ID":"fa455afd-33f9-4f97-9eb8-838444176453","Type":"ContainerStarted","Data":"126b6d742298b983f9b4a57092ea70f422cbe57ade3c51ce74d084c9286ec6e6"} Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.296734 4781 scope.go:117] "RemoveContainer" containerID="bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04" Dec 08 20:18:37 crc kubenswrapper[4781]: E1208 20:18:37.297576 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04\": container with ID starting with bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04 not found: ID does not exist" containerID="bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04" Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.297827 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04"} err="failed to get container status \"bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04\": rpc error: code = NotFound desc = could not find container \"bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04\": container with ID starting with bbe501500c69e972f96e8db1a2a4805c29921278b248cbefdcd13ea470510e04 not found: ID does not exist" Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.306526 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rh2lk"] Dec 08 20:18:37 crc kubenswrapper[4781]: I1208 20:18:37.318168 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rh2lk"] 
Dec 08 20:18:38 crc kubenswrapper[4781]: I1208 20:18:38.134750 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f012243e-0cc8-482e-922a-b81ba9644786" path="/var/lib/kubelet/pods/f012243e-0cc8-482e-922a-b81ba9644786/volumes" Dec 08 20:18:38 crc kubenswrapper[4781]: I1208 20:18:38.283633 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-924xv" event={"ID":"fa455afd-33f9-4f97-9eb8-838444176453","Type":"ContainerStarted","Data":"2a59b5b197133cd4549aea9ed3edec56d653916cc2f2f42f933ca420b81d2080"} Dec 08 20:18:38 crc kubenswrapper[4781]: I1208 20:18:38.303656 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-924xv" podStartSLOduration=2.8966075460000003 podStartE2EDuration="3.30363925s" podCreationTimestamp="2025-12-08 20:18:35 +0000 UTC" firstStartedPulling="2025-12-08 20:18:36.744386289 +0000 UTC m=+832.895669676" lastFinishedPulling="2025-12-08 20:18:37.151418003 +0000 UTC m=+833.302701380" observedRunningTime="2025-12-08 20:18:38.301357865 +0000 UTC m=+834.452641242" watchObservedRunningTime="2025-12-08 20:18:38.30363925 +0000 UTC m=+834.454922627" Dec 08 20:18:46 crc kubenswrapper[4781]: I1208 20:18:46.317042 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-924xv" Dec 08 20:18:46 crc kubenswrapper[4781]: I1208 20:18:46.317683 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-924xv" Dec 08 20:18:46 crc kubenswrapper[4781]: I1208 20:18:46.345132 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-924xv" Dec 08 20:18:46 crc kubenswrapper[4781]: I1208 20:18:46.377199 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-924xv" Dec 08 
20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.555807 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2"] Dec 08 20:18:52 crc kubenswrapper[4781]: E1208 20:18:52.557535 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f012243e-0cc8-482e-922a-b81ba9644786" containerName="registry-server" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.557625 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f012243e-0cc8-482e-922a-b81ba9644786" containerName="registry-server" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.557806 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f012243e-0cc8-482e-922a-b81ba9644786" containerName="registry-server" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.558703 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.561008 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9pjkf" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.569744 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2"] Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.612466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-util\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.612532 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-bundle\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.612581 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdmv6\" (UniqueName: \"kubernetes.io/projected/5a0379ef-fb54-4ec0-b240-fbe55e702817-kube-api-access-sdmv6\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.714287 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdmv6\" (UniqueName: \"kubernetes.io/projected/5a0379ef-fb54-4ec0-b240-fbe55e702817-kube-api-access-sdmv6\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.714468 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-util\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.714509 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-bundle\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.715339 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-util\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.715421 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-bundle\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.743992 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdmv6\" (UniqueName: \"kubernetes.io/projected/5a0379ef-fb54-4ec0-b240-fbe55e702817-kube-api-access-sdmv6\") pod \"b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:52 crc kubenswrapper[4781]: I1208 20:18:52.883281 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:53 crc kubenswrapper[4781]: I1208 20:18:53.309592 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2"] Dec 08 20:18:53 crc kubenswrapper[4781]: W1208 20:18:53.312473 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0379ef_fb54_4ec0_b240_fbe55e702817.slice/crio-9cb99e2e39d56b6aa064f68c1d296ff467bed1631ef24a558516c0489d6e2862 WatchSource:0}: Error finding container 9cb99e2e39d56b6aa064f68c1d296ff467bed1631ef24a558516c0489d6e2862: Status 404 returned error can't find the container with id 9cb99e2e39d56b6aa064f68c1d296ff467bed1631ef24a558516c0489d6e2862 Dec 08 20:18:53 crc kubenswrapper[4781]: I1208 20:18:53.384096 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" event={"ID":"5a0379ef-fb54-4ec0-b240-fbe55e702817","Type":"ContainerStarted","Data":"9cb99e2e39d56b6aa064f68c1d296ff467bed1631ef24a558516c0489d6e2862"} Dec 08 20:18:54 crc kubenswrapper[4781]: I1208 20:18:54.393340 4781 generic.go:334] "Generic (PLEG): container finished" podID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerID="a5a170c5c38f8f7e2137efd2a578624ddc86ee264d835687522b05baca13ef90" exitCode=0 Dec 08 20:18:54 crc kubenswrapper[4781]: I1208 20:18:54.393388 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" event={"ID":"5a0379ef-fb54-4ec0-b240-fbe55e702817","Type":"ContainerDied","Data":"a5a170c5c38f8f7e2137efd2a578624ddc86ee264d835687522b05baca13ef90"} Dec 08 20:18:55 crc kubenswrapper[4781]: I1208 20:18:55.399559 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerID="c6c914f21654062b37e93cd01dce062f28c757c9ad219e3c9c77a2f705284987" exitCode=0 Dec 08 20:18:55 crc kubenswrapper[4781]: I1208 20:18:55.399599 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" event={"ID":"5a0379ef-fb54-4ec0-b240-fbe55e702817","Type":"ContainerDied","Data":"c6c914f21654062b37e93cd01dce062f28c757c9ad219e3c9c77a2f705284987"} Dec 08 20:18:56 crc kubenswrapper[4781]: I1208 20:18:56.409167 4781 generic.go:334] "Generic (PLEG): container finished" podID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerID="eb206bc958d1f92e0b46603c735c0435af339ff2f4837be2c832586b32a849f2" exitCode=0 Dec 08 20:18:56 crc kubenswrapper[4781]: I1208 20:18:56.409227 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" event={"ID":"5a0379ef-fb54-4ec0-b240-fbe55e702817","Type":"ContainerDied","Data":"eb206bc958d1f92e0b46603c735c0435af339ff2f4837be2c832586b32a849f2"} Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.703976 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.779428 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-bundle\") pod \"5a0379ef-fb54-4ec0-b240-fbe55e702817\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.779504 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-util\") pod \"5a0379ef-fb54-4ec0-b240-fbe55e702817\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.779598 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdmv6\" (UniqueName: \"kubernetes.io/projected/5a0379ef-fb54-4ec0-b240-fbe55e702817-kube-api-access-sdmv6\") pod \"5a0379ef-fb54-4ec0-b240-fbe55e702817\" (UID: \"5a0379ef-fb54-4ec0-b240-fbe55e702817\") " Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.781187 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-bundle" (OuterVolumeSpecName: "bundle") pod "5a0379ef-fb54-4ec0-b240-fbe55e702817" (UID: "5a0379ef-fb54-4ec0-b240-fbe55e702817"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.786167 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0379ef-fb54-4ec0-b240-fbe55e702817-kube-api-access-sdmv6" (OuterVolumeSpecName: "kube-api-access-sdmv6") pod "5a0379ef-fb54-4ec0-b240-fbe55e702817" (UID: "5a0379ef-fb54-4ec0-b240-fbe55e702817"). InnerVolumeSpecName "kube-api-access-sdmv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.794639 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-util" (OuterVolumeSpecName: "util") pod "5a0379ef-fb54-4ec0-b240-fbe55e702817" (UID: "5a0379ef-fb54-4ec0-b240-fbe55e702817"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.880866 4781 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.880906 4781 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a0379ef-fb54-4ec0-b240-fbe55e702817-util\") on node \"crc\" DevicePath \"\"" Dec 08 20:18:57 crc kubenswrapper[4781]: I1208 20:18:57.880947 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdmv6\" (UniqueName: \"kubernetes.io/projected/5a0379ef-fb54-4ec0-b240-fbe55e702817-kube-api-access-sdmv6\") on node \"crc\" DevicePath \"\"" Dec 08 20:18:58 crc kubenswrapper[4781]: I1208 20:18:58.423667 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" event={"ID":"5a0379ef-fb54-4ec0-b240-fbe55e702817","Type":"ContainerDied","Data":"9cb99e2e39d56b6aa064f68c1d296ff467bed1631ef24a558516c0489d6e2862"} Dec 08 20:18:58 crc kubenswrapper[4781]: I1208 20:18:58.423708 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb99e2e39d56b6aa064f68c1d296ff467bed1631ef24a558516c0489d6e2862" Dec 08 20:18:58 crc kubenswrapper[4781]: I1208 20:18:58.423749 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2" Dec 08 20:18:59 crc kubenswrapper[4781]: I1208 20:18:59.948241 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:18:59 crc kubenswrapper[4781]: I1208 20:18:59.948588 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:18:59 crc kubenswrapper[4781]: I1208 20:18:59.948636 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:18:59 crc kubenswrapper[4781]: I1208 20:18:59.949317 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"771a2f271b567bb174dbc7c73044708946a8faab42d018d48f6894347bef1ff3"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:18:59 crc kubenswrapper[4781]: I1208 20:18:59.949390 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://771a2f271b567bb174dbc7c73044708946a8faab42d018d48f6894347bef1ff3" gracePeriod=600 Dec 08 20:19:00 crc kubenswrapper[4781]: I1208 20:19:00.437849 4781 generic.go:334] "Generic (PLEG): 
container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="771a2f271b567bb174dbc7c73044708946a8faab42d018d48f6894347bef1ff3" exitCode=0 Dec 08 20:19:00 crc kubenswrapper[4781]: I1208 20:19:00.437951 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"771a2f271b567bb174dbc7c73044708946a8faab42d018d48f6894347bef1ff3"} Dec 08 20:19:00 crc kubenswrapper[4781]: I1208 20:19:00.438450 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"25df9dbfafc8a5164a8f6020132a91b9381bc39dadc7e73659feabffa41e871a"} Dec 08 20:19:00 crc kubenswrapper[4781]: I1208 20:19:00.438545 4781 scope.go:117] "RemoveContainer" containerID="dfd2ead44684e651bc28c73f98c7b6fe01726a0e85f06c1edad8116b86010483" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.644153 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk"] Dec 08 20:19:04 crc kubenswrapper[4781]: E1208 20:19:04.644851 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerName="util" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.644865 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerName="util" Dec 08 20:19:04 crc kubenswrapper[4781]: E1208 20:19:04.644882 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerName="extract" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.644890 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerName="extract" Dec 08 20:19:04 crc 
kubenswrapper[4781]: E1208 20:19:04.644902 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerName="pull" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.644911 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerName="pull" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.645060 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0379ef-fb54-4ec0-b240-fbe55e702817" containerName="extract" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.645513 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.666078 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-fkrqq" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.693382 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lw4b\" (UniqueName: \"kubernetes.io/projected/e3c328d0-5fc0-4900-b90b-0b89bf486395-kube-api-access-8lw4b\") pod \"openstack-operator-controller-operator-5f48db4cb9-gl7xk\" (UID: \"e3c328d0-5fc0-4900-b90b-0b89bf486395\") " pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.710841 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk"] Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.794193 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lw4b\" (UniqueName: \"kubernetes.io/projected/e3c328d0-5fc0-4900-b90b-0b89bf486395-kube-api-access-8lw4b\") pod 
\"openstack-operator-controller-operator-5f48db4cb9-gl7xk\" (UID: \"e3c328d0-5fc0-4900-b90b-0b89bf486395\") " pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.814757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lw4b\" (UniqueName: \"kubernetes.io/projected/e3c328d0-5fc0-4900-b90b-0b89bf486395-kube-api-access-8lw4b\") pod \"openstack-operator-controller-operator-5f48db4cb9-gl7xk\" (UID: \"e3c328d0-5fc0-4900-b90b-0b89bf486395\") " pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" Dec 08 20:19:04 crc kubenswrapper[4781]: I1208 20:19:04.969440 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" Dec 08 20:19:05 crc kubenswrapper[4781]: I1208 20:19:05.184512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk"] Dec 08 20:19:05 crc kubenswrapper[4781]: I1208 20:19:05.468741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" event={"ID":"e3c328d0-5fc0-4900-b90b-0b89bf486395","Type":"ContainerStarted","Data":"ecda623b27a5251b566cfaf6a007374e8570d33c483fb2050aba49754dc15e93"} Dec 08 20:19:10 crc kubenswrapper[4781]: I1208 20:19:10.508598 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" event={"ID":"e3c328d0-5fc0-4900-b90b-0b89bf486395","Type":"ContainerStarted","Data":"cc5fb37b5a94549e7b2cd698530093d3fea77fd68143f5df3405d7f7522ce3dd"} Dec 08 20:19:10 crc kubenswrapper[4781]: I1208 20:19:10.509186 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" Dec 08 
20:19:10 crc kubenswrapper[4781]: I1208 20:19:10.539405 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" podStartSLOduration=2.1513823739999998 podStartE2EDuration="6.539384391s" podCreationTimestamp="2025-12-08 20:19:04 +0000 UTC" firstStartedPulling="2025-12-08 20:19:05.196465174 +0000 UTC m=+861.347748551" lastFinishedPulling="2025-12-08 20:19:09.584467191 +0000 UTC m=+865.735750568" observedRunningTime="2025-12-08 20:19:10.533841261 +0000 UTC m=+866.685124658" watchObservedRunningTime="2025-12-08 20:19:10.539384391 +0000 UTC m=+866.690667768" Dec 08 20:19:14 crc kubenswrapper[4781]: I1208 20:19:14.972406 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5f48db4cb9-gl7xk" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.106572 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.109482 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.112145 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.112790 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-p2kzd" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.113016 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.114890 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-9zwtq" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.123985 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.131590 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.144327 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.145205 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.151130 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-p5shf" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.158319 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.159244 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.160859 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-jqxlh" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.164799 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.195895 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.214638 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4swk\" (UniqueName: \"kubernetes.io/projected/17686434-377d-4a2f-b25e-e0074d2e06c6-kube-api-access-f4swk\") pod \"cinder-operator-controller-manager-6c677c69b-tk48g\" (UID: \"17686434-377d-4a2f-b25e-e0074d2e06c6\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.214693 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6krf\" (UniqueName: \"kubernetes.io/projected/c1ed3c21-a6cb-43f6-a018-8bead69a5439-kube-api-access-p6krf\") pod \"designate-operator-controller-manager-697fb699cf-cg5z2\" (UID: \"c1ed3c21-a6cb-43f6-a018-8bead69a5439\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.214769 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj6kr\" (UniqueName: \"kubernetes.io/projected/b44e4d42-05a5-42e2-8a45-5d5506fbbb23-kube-api-access-wj6kr\") pod 
\"glance-operator-controller-manager-5697bb5779-64kcb\" (UID: \"b44e4d42-05a5-42e2-8a45-5d5506fbbb23\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.214817 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2s5c\" (UniqueName: \"kubernetes.io/projected/33445aa2-3e5b-4b50-ba7a-0f86d08dd64d-kube-api-access-s2s5c\") pod \"barbican-operator-controller-manager-7d9dfd778-spkzc\" (UID: \"33445aa2-3e5b-4b50-ba7a-0f86d08dd64d\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.215851 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.217250 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.220187 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.224100 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-rn5nk" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.235890 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.236873 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.239042 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-5gg4n" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.242041 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.243341 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.248333 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.257500 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.257867 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-cz5fj" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.270217 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-cv24b"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.294802 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.301764 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cljqt" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.319086 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.319225 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2s5c\" (UniqueName: \"kubernetes.io/projected/33445aa2-3e5b-4b50-ba7a-0f86d08dd64d-kube-api-access-s2s5c\") pod \"barbican-operator-controller-manager-7d9dfd778-spkzc\" (UID: \"33445aa2-3e5b-4b50-ba7a-0f86d08dd64d\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.319672 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.319845 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2c89\" (UniqueName: \"kubernetes.io/projected/47e01596-c50b-44f5-82fb-1b6c7a005d10-kube-api-access-m2c89\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.320011 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f4swk\" (UniqueName: \"kubernetes.io/projected/17686434-377d-4a2f-b25e-e0074d2e06c6-kube-api-access-f4swk\") pod \"cinder-operator-controller-manager-6c677c69b-tk48g\" (UID: \"17686434-377d-4a2f-b25e-e0074d2e06c6\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.320177 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4z2\" (UniqueName: \"kubernetes.io/projected/5e629e44-f8b6-410a-baa9-b076e609686c-kube-api-access-zf4z2\") pod \"heat-operator-controller-manager-5f64f6f8bb-7dhp5\" (UID: \"5e629e44-f8b6-410a-baa9-b076e609686c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.320726 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6krf\" (UniqueName: \"kubernetes.io/projected/c1ed3c21-a6cb-43f6-a018-8bead69a5439-kube-api-access-p6krf\") pod \"designate-operator-controller-manager-697fb699cf-cg5z2\" (UID: \"c1ed3c21-a6cb-43f6-a018-8bead69a5439\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.320878 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvs2n\" (UniqueName: \"kubernetes.io/projected/97f1308f-60a9-4e7f-b029-8bc13246ba9e-kube-api-access-qvs2n\") pod \"ironic-operator-controller-manager-967d97867-cv24b\" (UID: \"97f1308f-60a9-4e7f-b029-8bc13246ba9e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.321105 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4pnv\" (UniqueName: 
\"kubernetes.io/projected/76fef842-95b4-47f1-9c34-a4edc70a3cbf-kube-api-access-j4pnv\") pod \"horizon-operator-controller-manager-68c6d99b8f-bkrvn\" (UID: \"76fef842-95b4-47f1-9c34-a4edc70a3cbf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.321245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj6kr\" (UniqueName: \"kubernetes.io/projected/b44e4d42-05a5-42e2-8a45-5d5506fbbb23-kube-api-access-wj6kr\") pod \"glance-operator-controller-manager-5697bb5779-64kcb\" (UID: \"b44e4d42-05a5-42e2-8a45-5d5506fbbb23\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.373389 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-cv24b"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.377635 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6krf\" (UniqueName: \"kubernetes.io/projected/c1ed3c21-a6cb-43f6-a018-8bead69a5439-kube-api-access-p6krf\") pod \"designate-operator-controller-manager-697fb699cf-cg5z2\" (UID: \"c1ed3c21-a6cb-43f6-a018-8bead69a5439\") " pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.377635 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4swk\" (UniqueName: \"kubernetes.io/projected/17686434-377d-4a2f-b25e-e0074d2e06c6-kube-api-access-f4swk\") pod \"cinder-operator-controller-manager-6c677c69b-tk48g\" (UID: \"17686434-377d-4a2f-b25e-e0074d2e06c6\") " pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.383339 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.385535 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.387543 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-twhpb" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.389136 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj6kr\" (UniqueName: \"kubernetes.io/projected/b44e4d42-05a5-42e2-8a45-5d5506fbbb23-kube-api-access-wj6kr\") pod \"glance-operator-controller-manager-5697bb5779-64kcb\" (UID: \"b44e4d42-05a5-42e2-8a45-5d5506fbbb23\") " pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.393314 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.399810 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.405254 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-b79wh" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.405729 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2s5c\" (UniqueName: \"kubernetes.io/projected/33445aa2-3e5b-4b50-ba7a-0f86d08dd64d-kube-api-access-s2s5c\") pod \"barbican-operator-controller-manager-7d9dfd778-spkzc\" (UID: \"33445aa2-3e5b-4b50-ba7a-0f86d08dd64d\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.407991 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.418161 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.425979 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.427180 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.428064 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.428106 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smjj5\" (UniqueName: \"kubernetes.io/projected/51e5598f-4979-4f4d-a947-323c19dd3102-kube-api-access-smjj5\") pod \"manila-operator-controller-manager-5b5fd79c9c-swkcz\" (UID: \"51e5598f-4979-4f4d-a947-323c19dd3102\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.428130 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2c89\" (UniqueName: \"kubernetes.io/projected/47e01596-c50b-44f5-82fb-1b6c7a005d10-kube-api-access-m2c89\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.428155 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4z2\" (UniqueName: \"kubernetes.io/projected/5e629e44-f8b6-410a-baa9-b076e609686c-kube-api-access-zf4z2\") pod \"heat-operator-controller-manager-5f64f6f8bb-7dhp5\" (UID: \"5e629e44-f8b6-410a-baa9-b076e609686c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.428176 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvs2n\" (UniqueName: \"kubernetes.io/projected/97f1308f-60a9-4e7f-b029-8bc13246ba9e-kube-api-access-qvs2n\") pod \"ironic-operator-controller-manager-967d97867-cv24b\" (UID: \"97f1308f-60a9-4e7f-b029-8bc13246ba9e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.428195 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbws\" (UniqueName: \"kubernetes.io/projected/344dd244-42be-4538-92c0-ab4be8f8a093-kube-api-access-hzbws\") pod \"keystone-operator-controller-manager-7765d96ddf-fjhrn\" (UID: \"344dd244-42be-4538-92c0-ab4be8f8a093\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.428228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4pnv\" (UniqueName: \"kubernetes.io/projected/76fef842-95b4-47f1-9c34-a4edc70a3cbf-kube-api-access-j4pnv\") pod \"horizon-operator-controller-manager-68c6d99b8f-bkrvn\" (UID: \"76fef842-95b4-47f1-9c34-a4edc70a3cbf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" Dec 08 20:19:53 crc kubenswrapper[4781]: E1208 20:19:53.428452 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 20:19:53 crc kubenswrapper[4781]: E1208 20:19:53.428492 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert podName:47e01596-c50b-44f5-82fb-1b6c7a005d10 nodeName:}" failed. No retries permitted until 2025-12-08 20:19:53.928479677 +0000 UTC m=+910.079763054 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert") pod "infra-operator-controller-manager-78d48bff9d-5mbrr" (UID: "47e01596-c50b-44f5-82fb-1b6c7a005d10") : secret "infra-operator-webhook-server-cert" not found Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.435308 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.437414 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.443646 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.447797 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5tnl7" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.449533 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-s2zf7" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.449963 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.463502 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.464569 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.465068 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.466039 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4z2\" (UniqueName: \"kubernetes.io/projected/5e629e44-f8b6-410a-baa9-b076e609686c-kube-api-access-zf4z2\") pod \"heat-operator-controller-manager-5f64f6f8bb-7dhp5\" (UID: \"5e629e44-f8b6-410a-baa9-b076e609686c\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.466292 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-6s4p8" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.467390 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2c89\" (UniqueName: \"kubernetes.io/projected/47e01596-c50b-44f5-82fb-1b6c7a005d10-kube-api-access-m2c89\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.471836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4pnv\" (UniqueName: \"kubernetes.io/projected/76fef842-95b4-47f1-9c34-a4edc70a3cbf-kube-api-access-j4pnv\") pod \"horizon-operator-controller-manager-68c6d99b8f-bkrvn\" (UID: \"76fef842-95b4-47f1-9c34-a4edc70a3cbf\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.486448 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.506992 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.516564 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvs2n\" (UniqueName: \"kubernetes.io/projected/97f1308f-60a9-4e7f-b029-8bc13246ba9e-kube-api-access-qvs2n\") pod \"ironic-operator-controller-manager-967d97867-cv24b\" (UID: \"97f1308f-60a9-4e7f-b029-8bc13246ba9e\") " pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.525051 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.529718 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4czr4\" (UniqueName: \"kubernetes.io/projected/e00e8983-d123-42e8-a4ef-2a2bbda78cde-kube-api-access-4czr4\") pod \"nova-operator-controller-manager-697bc559fc-p7f7t\" (UID: \"e00e8983-d123-42e8-a4ef-2a2bbda78cde\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.529768 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzbws\" (UniqueName: \"kubernetes.io/projected/344dd244-42be-4538-92c0-ab4be8f8a093-kube-api-access-hzbws\") pod \"keystone-operator-controller-manager-7765d96ddf-fjhrn\" (UID: \"344dd244-42be-4538-92c0-ab4be8f8a093\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.529812 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kxcx\" (UniqueName: \"kubernetes.io/projected/3b159a78-5c17-430a-ac90-f1d4e7fac757-kube-api-access-9kxcx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zkslx\" (UID: \"3b159a78-5c17-430a-ac90-f1d4e7fac757\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.529903 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smjj5\" (UniqueName: \"kubernetes.io/projected/51e5598f-4979-4f4d-a947-323c19dd3102-kube-api-access-smjj5\") pod \"manila-operator-controller-manager-5b5fd79c9c-swkcz\" (UID: \"51e5598f-4979-4f4d-a947-323c19dd3102\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.529949 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkkgr\" (UniqueName: \"kubernetes.io/projected/2c1ab608-2b41-451a-b6b9-2cf867ab289b-kube-api-access-mkkgr\") pod \"mariadb-operator-controller-manager-79c8c4686c-m9tns\" (UID: \"2c1ab608-2b41-451a-b6b9-2cf867ab289b\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.544237 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.548419 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.566961 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.567958 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzbws\" (UniqueName: \"kubernetes.io/projected/344dd244-42be-4538-92c0-ab4be8f8a093-kube-api-access-hzbws\") pod \"keystone-operator-controller-manager-7765d96ddf-fjhrn\" (UID: \"344dd244-42be-4538-92c0-ab4be8f8a093\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.574993 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-fql79"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.576056 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.588762 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-qrfjx" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.607634 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.608948 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.622462 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.626983 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-fql79"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.650354 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-dh8xl" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.650643 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgd5v\" (UniqueName: \"kubernetes.io/projected/b5739409-2e10-4acf-8088-99608fc2f489-kube-api-access-pgd5v\") pod \"ovn-operator-controller-manager-b6456fdb6-dnh9j\" (UID: \"b5739409-2e10-4acf-8088-99608fc2f489\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.650859 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4czr4\" (UniqueName: \"kubernetes.io/projected/e00e8983-d123-42e8-a4ef-2a2bbda78cde-kube-api-access-4czr4\") pod \"nova-operator-controller-manager-697bc559fc-p7f7t\" (UID: \"e00e8983-d123-42e8-a4ef-2a2bbda78cde\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.650968 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kxcx\" (UniqueName: \"kubernetes.io/projected/3b159a78-5c17-430a-ac90-f1d4e7fac757-kube-api-access-9kxcx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zkslx\" (UID: \"3b159a78-5c17-430a-ac90-f1d4e7fac757\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.650999 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhzn\" (UniqueName: \"kubernetes.io/projected/78753ed6-755b-4e63-8026-50722a9637a9-kube-api-access-cxhzn\") pod \"octavia-operator-controller-manager-998648c74-fql79\" (UID: \"78753ed6-755b-4e63-8026-50722a9637a9\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.651148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkkgr\" (UniqueName: \"kubernetes.io/projected/2c1ab608-2b41-451a-b6b9-2cf867ab289b-kube-api-access-mkkgr\") pod \"mariadb-operator-controller-manager-79c8c4686c-m9tns\" (UID: \"2c1ab608-2b41-451a-b6b9-2cf867ab289b\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.653066 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.665601 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.669726 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.680578 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.680801 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.682175 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.683605 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cfc5r" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.695894 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-jn4s5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.702342 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.712686 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smjj5\" (UniqueName: \"kubernetes.io/projected/51e5598f-4979-4f4d-a947-323c19dd3102-kube-api-access-smjj5\") pod \"manila-operator-controller-manager-5b5fd79c9c-swkcz\" (UID: \"51e5598f-4979-4f4d-a947-323c19dd3102\") " pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.713288 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kxcx\" (UniqueName: 
\"kubernetes.io/projected/3b159a78-5c17-430a-ac90-f1d4e7fac757-kube-api-access-9kxcx\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-zkslx\" (UID: \"3b159a78-5c17-430a-ac90-f1d4e7fac757\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.717263 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.718599 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.723979 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-r6llx" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.731419 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkkgr\" (UniqueName: \"kubernetes.io/projected/2c1ab608-2b41-451a-b6b9-2cf867ab289b-kube-api-access-mkkgr\") pod \"mariadb-operator-controller-manager-79c8c4686c-m9tns\" (UID: \"2c1ab608-2b41-451a-b6b9-2cf867ab289b\") " pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.732864 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4czr4\" (UniqueName: \"kubernetes.io/projected/e00e8983-d123-42e8-a4ef-2a2bbda78cde-kube-api-access-4czr4\") pod \"nova-operator-controller-manager-697bc559fc-p7f7t\" (UID: \"e00e8983-d123-42e8-a4ef-2a2bbda78cde\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.753518 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgd5v\" (UniqueName: 
\"kubernetes.io/projected/b5739409-2e10-4acf-8088-99608fc2f489-kube-api-access-pgd5v\") pod \"ovn-operator-controller-manager-b6456fdb6-dnh9j\" (UID: \"b5739409-2e10-4acf-8088-99608fc2f489\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.753583 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6lsw\" (UniqueName: \"kubernetes.io/projected/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-kube-api-access-h6lsw\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.753707 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxhzn\" (UniqueName: \"kubernetes.io/projected/78753ed6-755b-4e63-8026-50722a9637a9-kube-api-access-cxhzn\") pod \"octavia-operator-controller-manager-998648c74-fql79\" (UID: \"78753ed6-755b-4e63-8026-50722a9637a9\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.753753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.753995 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvlkt\" (UniqueName: \"kubernetes.io/projected/b86b0a68-3152-46ac-8bde-3bfd32c6fbf2-kube-api-access-mvlkt\") pod 
\"placement-operator-controller-manager-78f8948974-rdrrr\" (UID: \"b86b0a68-3152-46ac-8bde-3bfd32c6fbf2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.778120 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxhzn\" (UniqueName: \"kubernetes.io/projected/78753ed6-755b-4e63-8026-50722a9637a9-kube-api-access-cxhzn\") pod \"octavia-operator-controller-manager-998648c74-fql79\" (UID: \"78753ed6-755b-4e63-8026-50722a9637a9\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.783981 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.784041 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.810484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgd5v\" (UniqueName: \"kubernetes.io/projected/b5739409-2e10-4acf-8088-99608fc2f489-kube-api-access-pgd5v\") pod \"ovn-operator-controller-manager-b6456fdb6-dnh9j\" (UID: \"b5739409-2e10-4acf-8088-99608fc2f489\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.826908 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.842143 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.843167 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.846396 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-k5vxt" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.854524 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.858515 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.858582 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvlkt\" (UniqueName: \"kubernetes.io/projected/b86b0a68-3152-46ac-8bde-3bfd32c6fbf2-kube-api-access-mvlkt\") pod \"placement-operator-controller-manager-78f8948974-rdrrr\" (UID: \"b86b0a68-3152-46ac-8bde-3bfd32c6fbf2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.858743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6lsw\" (UniqueName: \"kubernetes.io/projected/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-kube-api-access-h6lsw\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.858818 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9c7t\" (UniqueName: \"kubernetes.io/projected/9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e-kube-api-access-g9c7t\") pod \"swift-operator-controller-manager-9d58d64bc-24k5k\" (UID: \"9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" Dec 08 20:19:53 crc kubenswrapper[4781]: E1208 20:19:53.859512 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:19:53 crc kubenswrapper[4781]: E1208 20:19:53.859563 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert podName:0bafa66c-7cf8-40eb-ae15-a4365fbe3176 nodeName:}" failed. No retries permitted until 2025-12-08 20:19:54.359547634 +0000 UTC m=+910.510831001 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert") pod "openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" (UID: "0bafa66c-7cf8-40eb-ae15-a4365fbe3176") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.900755 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6lsw\" (UniqueName: \"kubernetes.io/projected/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-kube-api-access-h6lsw\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.916265 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.932335 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.939788 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-wrt9m" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.944553 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvlkt\" (UniqueName: \"kubernetes.io/projected/b86b0a68-3152-46ac-8bde-3bfd32c6fbf2-kube-api-access-mvlkt\") pod \"placement-operator-controller-manager-78f8948974-rdrrr\" (UID: \"b86b0a68-3152-46ac-8bde-3bfd32c6fbf2\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.967317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsjl8\" (UniqueName: \"kubernetes.io/projected/7f697324-2ce1-4c81-89c8-9cc53bac7062-kube-api-access-fsjl8\") pod \"telemetry-operator-controller-manager-58d5ff84df-j5vws\" (UID: \"7f697324-2ce1-4c81-89c8-9cc53bac7062\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.970469 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.970612 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9c7t\" (UniqueName: \"kubernetes.io/projected/9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e-kube-api-access-g9c7t\") pod \"swift-operator-controller-manager-9d58d64bc-24k5k\" (UID: 
\"9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.971447 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp"] Dec 08 20:19:53 crc kubenswrapper[4781]: E1208 20:19:53.971501 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 20:19:53 crc kubenswrapper[4781]: E1208 20:19:53.971564 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert podName:47e01596-c50b-44f5-82fb-1b6c7a005d10 nodeName:}" failed. No retries permitted until 2025-12-08 20:19:54.971539773 +0000 UTC m=+911.122823150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert") pod "infra-operator-controller-manager-78d48bff9d-5mbrr" (UID: "47e01596-c50b-44f5-82fb-1b6c7a005d10") : secret "infra-operator-webhook-server-cert" not found Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.977042 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46"] Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.979047 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.987083 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ml87v" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.988449 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" Dec 08 20:19:53 crc kubenswrapper[4781]: I1208 20:19:53.997110 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.003931 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46"] Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.027983 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.028685 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.051476 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.060680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9c7t\" (UniqueName: \"kubernetes.io/projected/9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e-kube-api-access-g9c7t\") pod \"swift-operator-controller-manager-9d58d64bc-24k5k\" (UID: \"9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e\") " pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.071544 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rllrx\" (UniqueName: \"kubernetes.io/projected/908c60a3-663f-4a4a-9223-fbcff50de2b9-kube-api-access-rllrx\") pod \"test-operator-controller-manager-5854674fcc-2b4gp\" (UID: \"908c60a3-663f-4a4a-9223-fbcff50de2b9\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.071636 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsjl8\" (UniqueName: \"kubernetes.io/projected/7f697324-2ce1-4c81-89c8-9cc53bac7062-kube-api-access-fsjl8\") pod \"telemetry-operator-controller-manager-58d5ff84df-j5vws\" (UID: \"7f697324-2ce1-4c81-89c8-9cc53bac7062\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.071706 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmzh9\" (UniqueName: \"kubernetes.io/projected/db28fdab-dab1-4d5c-9447-c895523b0985-kube-api-access-mmzh9\") pod \"watcher-operator-controller-manager-667bd8d554-fcn46\" (UID: \"db28fdab-dab1-4d5c-9447-c895523b0985\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" Dec 08 20:19:54 crc 
kubenswrapper[4781]: I1208 20:19:54.077666 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf"] Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.078656 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.087115 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.087986 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.088320 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bw89m" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.095251 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf"] Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.096825 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.113083 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g"] Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.113950 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsjl8\" (UniqueName: \"kubernetes.io/projected/7f697324-2ce1-4c81-89c8-9cc53bac7062-kube-api-access-fsjl8\") pod \"telemetry-operator-controller-manager-58d5ff84df-j5vws\" (UID: \"7f697324-2ce1-4c81-89c8-9cc53bac7062\") " pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.133600 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.156505 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.172728 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.172776 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rllrx\" (UniqueName: \"kubernetes.io/projected/908c60a3-663f-4a4a-9223-fbcff50de2b9-kube-api-access-rllrx\") pod \"test-operator-controller-manager-5854674fcc-2b4gp\" (UID: \"908c60a3-663f-4a4a-9223-fbcff50de2b9\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.172967 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.173064 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmzh9\" (UniqueName: \"kubernetes.io/projected/db28fdab-dab1-4d5c-9447-c895523b0985-kube-api-access-mmzh9\") pod \"watcher-operator-controller-manager-667bd8d554-fcn46\" (UID: \"db28fdab-dab1-4d5c-9447-c895523b0985\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 
20:19:54.173147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hjwx\" (UniqueName: \"kubernetes.io/projected/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-kube-api-access-8hjwx\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.179458 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9"] Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.180287 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9"] Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.180387 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.185425 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-fjkc8" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.198596 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rllrx\" (UniqueName: \"kubernetes.io/projected/908c60a3-663f-4a4a-9223-fbcff50de2b9-kube-api-access-rllrx\") pod \"test-operator-controller-manager-5854674fcc-2b4gp\" (UID: \"908c60a3-663f-4a4a-9223-fbcff50de2b9\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.201141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmzh9\" (UniqueName: \"kubernetes.io/projected/db28fdab-dab1-4d5c-9447-c895523b0985-kube-api-access-mmzh9\") pod 
\"watcher-operator-controller-manager-667bd8d554-fcn46\" (UID: \"db28fdab-dab1-4d5c-9447-c895523b0985\") " pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.201510 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" Dec 08 20:19:54 crc kubenswrapper[4781]: W1208 20:19:54.215311 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17686434_377d_4a2f_b25e_e0074d2e06c6.slice/crio-df3c4c7e8853d383bdf962d47c718e2fcf74dd58919a7d71ee200c4f7803824c WatchSource:0}: Error finding container df3c4c7e8853d383bdf962d47c718e2fcf74dd58919a7d71ee200c4f7803824c: Status 404 returned error can't find the container with id df3c4c7e8853d383bdf962d47c718e2fcf74dd58919a7d71ee200c4f7803824c Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.276689 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.276969 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.277043 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.277083 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rc2\" (UniqueName: \"kubernetes.io/projected/60bdaea6-28dd-4dac-b1b8-a046ea0c90e0-kube-api-access-f2rc2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p26p9\" (UID: \"60bdaea6-28dd-4dac-b1b8-a046ea0c90e0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.277108 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hjwx\" (UniqueName: \"kubernetes.io/projected/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-kube-api-access-8hjwx\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.277472 
4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.277506 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:19:54.777492685 +0000 UTC m=+910.928776062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "metrics-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.277644 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.277668 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:19:54.777660789 +0000 UTC m=+910.928944166 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "webhook-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.305339 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hjwx\" (UniqueName: \"kubernetes.io/projected/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-kube-api-access-8hjwx\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.329120 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.378664 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.379005 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rc2\" (UniqueName: \"kubernetes.io/projected/60bdaea6-28dd-4dac-b1b8-a046ea0c90e0-kube-api-access-f2rc2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p26p9\" (UID: \"60bdaea6-28dd-4dac-b1b8-a046ea0c90e0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.378947 4781 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.379102 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert podName:0bafa66c-7cf8-40eb-ae15-a4365fbe3176 nodeName:}" failed. No retries permitted until 2025-12-08 20:19:55.379088224 +0000 UTC m=+911.530371601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert") pod "openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" (UID: "0bafa66c-7cf8-40eb-ae15-a4365fbe3176") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.485878 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2rc2\" (UniqueName: \"kubernetes.io/projected/60bdaea6-28dd-4dac-b1b8-a046ea0c90e0-kube-api-access-f2rc2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p26p9\" (UID: \"60bdaea6-28dd-4dac-b1b8-a046ea0c90e0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.562742 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.637137 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2"] Dec 08 20:19:54 crc kubenswrapper[4781]: W1208 20:19:54.681180 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ed3c21_a6cb_43f6_a018_8bead69a5439.slice/crio-61b74391f8328323935b0659db1c7e6ce2ca67faa3ab665946564382cf7bf813 WatchSource:0}: Error finding container 61b74391f8328323935b0659db1c7e6ce2ca67faa3ab665946564382cf7bf813: Status 404 returned error can't find the container with id 61b74391f8328323935b0659db1c7e6ce2ca67faa3ab665946564382cf7bf813 Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.791783 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.791879 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.792082 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.792140 4781 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:19:55.792122273 +0000 UTC m=+911.943405660 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "metrics-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.792541 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.792568 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:19:55.792559346 +0000 UTC m=+911.943842723 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "webhook-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.804833 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" event={"ID":"c1ed3c21-a6cb-43f6-a018-8bead69a5439","Type":"ContainerStarted","Data":"61b74391f8328323935b0659db1c7e6ce2ca67faa3ab665946564382cf7bf813"} Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.812589 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" event={"ID":"17686434-377d-4a2f-b25e-e0074d2e06c6","Type":"ContainerStarted","Data":"df3c4c7e8853d383bdf962d47c718e2fcf74dd58919a7d71ee200c4f7803824c"} Dec 08 20:19:54 crc kubenswrapper[4781]: I1208 20:19:54.998334 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.998609 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 20:19:54 crc kubenswrapper[4781]: E1208 20:19:54.998654 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert podName:47e01596-c50b-44f5-82fb-1b6c7a005d10 nodeName:}" failed. No retries permitted until 2025-12-08 20:19:56.998640518 +0000 UTC m=+913.149923895 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert") pod "infra-operator-controller-manager-78d48bff9d-5mbrr" (UID: "47e01596-c50b-44f5-82fb-1b6c7a005d10") : secret "infra-operator-webhook-server-cert" not found Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.037619 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.055755 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5"] Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.056336 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e629e44_f8b6_410a_baa9_b076e609686c.slice/crio-6cdaa7bd402c1871885eddd4f5e67e9f49462c8fa4221abc51f7f76ac1928f57 WatchSource:0}: Error finding container 6cdaa7bd402c1871885eddd4f5e67e9f49462c8fa4221abc51f7f76ac1928f57: Status 404 returned error can't find the container with id 6cdaa7bd402c1871885eddd4f5e67e9f49462c8fa4221abc51f7f76ac1928f57 Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.071990 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.110718 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb"] Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.119901 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb44e4d42_05a5_42e2_8a45_5d5506fbbb23.slice/crio-031f342c3ab3ee8da8c2fe02501d275f166afb3c998241b719c67e1b74539555 WatchSource:0}: Error finding container 
031f342c3ab3ee8da8c2fe02501d275f166afb3c998241b719c67e1b74539555: Status 404 returned error can't find the container with id 031f342c3ab3ee8da8c2fe02501d275f166afb3c998241b719c67e1b74539555 Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.164096 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.172044 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-967d97867-cv24b"] Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.184736 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f1308f_60a9_4e7f_b029_8bc13246ba9e.slice/crio-a1f5b16ef1547390ef034d3fdc5caa8a595307aac3c3817d5398af033ee5d457 WatchSource:0}: Error finding container a1f5b16ef1547390ef034d3fdc5caa8a595307aac3c3817d5398af033ee5d457: Status 404 returned error can't find the container with id a1f5b16ef1547390ef034d3fdc5caa8a595307aac3c3817d5398af033ee5d457 Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.246653 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns"] Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.250084 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1ab608_2b41_451a_b6b9_2cf867ab289b.slice/crio-4fec4c04fa70c85fcc85845e5a7f9e27715632a5f73fdf7114d13b00a009096d WatchSource:0}: Error finding container 4fec4c04fa70c85fcc85845e5a7f9e27715632a5f73fdf7114d13b00a009096d: Status 404 returned error can't find the container with id 4fec4c04fa70c85fcc85845e5a7f9e27715632a5f73fdf7114d13b00a009096d Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.404936 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.405452 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.405502 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert podName:0bafa66c-7cf8-40eb-ae15-a4365fbe3176 nodeName:}" failed. No retries permitted until 2025-12-08 20:19:57.405486159 +0000 UTC m=+913.556769536 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert") pod "openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" (UID: "0bafa66c-7cf8-40eb-ae15-a4365fbe3176") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.525330 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.536446 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.542212 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.552833 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.557636 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.572104 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-fql79"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.587231 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.594129 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.600809 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9"] Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.607102 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws"] Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.610770 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb28fdab_dab1_4d5c_9447_c895523b0985.slice/crio-da9e465693465f9d0f7e672ca192046c216172d1f1fea24b26bee46f1be4ba80 WatchSource:0}: Error finding container da9e465693465f9d0f7e672ca192046c216172d1f1fea24b26bee46f1be4ba80: Status 404 returned error can't find the container with id da9e465693465f9d0f7e672ca192046c216172d1f1fea24b26bee46f1be4ba80 Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.612408 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5739409_2e10_4acf_8088_99608fc2f489.slice/crio-9ee8d20064693cefd7ce55ca448ea69d77c948e204c47a382d300e0956fb2300 WatchSource:0}: Error finding container 9ee8d20064693cefd7ce55ca448ea69d77c948e204c47a382d300e0956fb2300: Status 404 returned error can't find the container with id 9ee8d20064693cefd7ce55ca448ea69d77c948e204c47a382d300e0956fb2300 Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.613483 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k"] Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.616888 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f697324_2ce1_4c81_89c8_9cc53bac7062.slice/crio-92b3ef5492eeb7d99edfb51917baefb7be29e01bb7be7a6c2a77829985a2c4ad WatchSource:0}: Error finding container 92b3ef5492eeb7d99edfb51917baefb7be29e01bb7be7a6c2a77829985a2c4ad: Status 404 returned error can't find the container with id 92b3ef5492eeb7d99edfb51917baefb7be29e01bb7be7a6c2a77829985a2c4ad Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.620205 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g9c7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-24k5k_openstack-operators(9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.620214 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cxhzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-fql79_openstack-operators(78753ed6-755b-4e63-8026-50722a9637a9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.629410 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60bdaea6_28dd_4dac_b1b8_a046ea0c90e0.slice/crio-38e2e2e971fd56694231f00a0dbbb5ac9fea3fb51d93f7387f87f117c50d2151 WatchSource:0}: Error finding container 38e2e2e971fd56694231f00a0dbbb5ac9fea3fb51d93f7387f87f117c50d2151: Status 404 returned error can't find the container with id 38e2e2e971fd56694231f00a0dbbb5ac9fea3fb51d93f7387f87f117c50d2151 Dec 08 20:19:55 crc kubenswrapper[4781]: W1208 20:19:55.630876 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod908c60a3_663f_4a4a_9223_fbcff50de2b9.slice/crio-bb0fd662ea98fcd920f9c52ced9df5f43f65d54c2190ecb21f5b68e33acedddc WatchSource:0}: Error finding container bb0fd662ea98fcd920f9c52ced9df5f43f65d54c2190ecb21f5b68e33acedddc: Status 404 returned error can't find the container with id 
bb0fd662ea98fcd920f9c52ced9df5f43f65d54c2190ecb21f5b68e33acedddc Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.631702 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cxhzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-fql79_openstack-operators(78753ed6-755b-4e63-8026-50722a9637a9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.631703 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g9c7t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9d58d64bc-24k5k_openstack-operators(9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.632444 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fsjl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-j5vws_openstack-operators(7f697324-2ce1-4c81-89c8-9cc53bac7062): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.634864 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fsjl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-58d5ff84df-j5vws_openstack-operators(7f697324-2ce1-4c81-89c8-9cc53bac7062): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.634992 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" podUID="9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.635042 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" podUID="78753ed6-755b-4e63-8026-50722a9637a9" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.636065 4781 
pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" podUID="7f697324-2ce1-4c81-89c8-9cc53bac7062" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.636306 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rllrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-2b4gp_openstack-operators(908c60a3-663f-4a4a-9223-fbcff50de2b9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.636906 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f2rc2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-p26p9_openstack-operators(60bdaea6-28dd-4dac-b1b8-a046ea0c90e0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.638168 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" podUID="60bdaea6-28dd-4dac-b1b8-a046ea0c90e0" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.640091 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rllrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-2b4gp_openstack-operators(908c60a3-663f-4a4a-9223-fbcff50de2b9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.640477 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-smjj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5b5fd79c9c-swkcz_openstack-operators(51e5598f-4979-4f4d-a947-323c19dd3102): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.641333 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" podUID="908c60a3-663f-4a4a-9223-fbcff50de2b9" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.643480 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-smjj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5b5fd79c9c-swkcz_openstack-operators(51e5598f-4979-4f4d-a947-323c19dd3102): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.644981 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" podUID="51e5598f-4979-4f4d-a947-323c19dd3102" Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.817792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " 
pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.817887 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.817977 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.818253 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:19:57.81823615 +0000 UTC m=+913.969519537 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "webhook-server-cert" not found Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.818319 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.818370 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:19:57.818355343 +0000 UTC m=+913.969638720 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "metrics-server-cert" not found Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.833348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" event={"ID":"9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e","Type":"ContainerStarted","Data":"1a1f32847c23c08e1d2e153cafc1c39784a3a409b1311c5048dd227246ee48a5"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.836771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" event={"ID":"7f697324-2ce1-4c81-89c8-9cc53bac7062","Type":"ContainerStarted","Data":"92b3ef5492eeb7d99edfb51917baefb7be29e01bb7be7a6c2a77829985a2c4ad"} Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.846493 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" podUID="9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.848853 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" podUID="7f697324-2ce1-4c81-89c8-9cc53bac7062" Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.857288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" event={"ID":"3b159a78-5c17-430a-ac90-f1d4e7fac757","Type":"ContainerStarted","Data":"0944af5d597af55a6d6c6b2d7c094695ff733dfebf117b16e256b7cafde37454"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.866402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" event={"ID":"60bdaea6-28dd-4dac-b1b8-a046ea0c90e0","Type":"ContainerStarted","Data":"38e2e2e971fd56694231f00a0dbbb5ac9fea3fb51d93f7387f87f117c50d2151"} Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.868370 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" podUID="60bdaea6-28dd-4dac-b1b8-a046ea0c90e0" Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.869571 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" event={"ID":"db28fdab-dab1-4d5c-9447-c895523b0985","Type":"ContainerStarted","Data":"da9e465693465f9d0f7e672ca192046c216172d1f1fea24b26bee46f1be4ba80"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.871384 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" event={"ID":"e00e8983-d123-42e8-a4ef-2a2bbda78cde","Type":"ContainerStarted","Data":"475b407003f4df0d8e8109ab0a0c33325fa786383c464293bb5610f2f5006852"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.874315 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" event={"ID":"344dd244-42be-4538-92c0-ab4be8f8a093","Type":"ContainerStarted","Data":"e4e6cfa55ff1454ccef839edc4e1a214311e0d3a559b888ebb3b0bb072201d0a"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.887172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" event={"ID":"b44e4d42-05a5-42e2-8a45-5d5506fbbb23","Type":"ContainerStarted","Data":"031f342c3ab3ee8da8c2fe02501d275f166afb3c998241b719c67e1b74539555"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.888637 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" event={"ID":"97f1308f-60a9-4e7f-b029-8bc13246ba9e","Type":"ContainerStarted","Data":"a1f5b16ef1547390ef034d3fdc5caa8a595307aac3c3817d5398af033ee5d457"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.893591 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" event={"ID":"51e5598f-4979-4f4d-a947-323c19dd3102","Type":"ContainerStarted","Data":"55089e71ea631854ed52a6ad8fd940d0e09fa3e506b847c36a73a48eba51be4b"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.896526 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" event={"ID":"78753ed6-755b-4e63-8026-50722a9637a9","Type":"ContainerStarted","Data":"1e01022c4bbea409a89d6e1fe2ef71033a4fc8d10ef1077e8ea4226ab922fb30"} Dec 08 20:19:55 crc 
kubenswrapper[4781]: E1208 20:19:55.898078 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" podUID="51e5598f-4979-4f4d-a947-323c19dd3102" Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.898965 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" podUID="78753ed6-755b-4e63-8026-50722a9637a9" Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.900432 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" event={"ID":"76fef842-95b4-47f1-9c34-a4edc70a3cbf","Type":"ContainerStarted","Data":"491125f0574f822eb724a1b5dfa9ee104623cd83e48ddc0386ae0ababc21ac9e"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.904345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" event={"ID":"b86b0a68-3152-46ac-8bde-3bfd32c6fbf2","Type":"ContainerStarted","Data":"266f2b05b1b00455a120f87b4d7a5db2da9d2a5cd55325454a0ed9a98eed8429"} Dec 08 20:19:55 crc 
kubenswrapper[4781]: I1208 20:19:55.914519 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" event={"ID":"b5739409-2e10-4acf-8088-99608fc2f489","Type":"ContainerStarted","Data":"9ee8d20064693cefd7ce55ca448ea69d77c948e204c47a382d300e0956fb2300"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.917828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" event={"ID":"5e629e44-f8b6-410a-baa9-b076e609686c","Type":"ContainerStarted","Data":"6cdaa7bd402c1871885eddd4f5e67e9f49462c8fa4221abc51f7f76ac1928f57"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.946747 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" event={"ID":"908c60a3-663f-4a4a-9223-fbcff50de2b9","Type":"ContainerStarted","Data":"bb0fd662ea98fcd920f9c52ced9df5f43f65d54c2190ecb21f5b68e33acedddc"} Dec 08 20:19:55 crc kubenswrapper[4781]: E1208 20:19:55.950293 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" podUID="908c60a3-663f-4a4a-9223-fbcff50de2b9" Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.955281 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" 
event={"ID":"2c1ab608-2b41-451a-b6b9-2cf867ab289b","Type":"ContainerStarted","Data":"4fec4c04fa70c85fcc85845e5a7f9e27715632a5f73fdf7114d13b00a009096d"} Dec 08 20:19:55 crc kubenswrapper[4781]: I1208 20:19:55.966388 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" event={"ID":"33445aa2-3e5b-4b50-ba7a-0f86d08dd64d","Type":"ContainerStarted","Data":"4976f2bc46453fcf4b79304c2800bd8ab35ac36c8946d22e37953bdc4abde1b7"} Dec 08 20:19:56 crc kubenswrapper[4781]: E1208 20:19:56.981344 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" podUID="60bdaea6-28dd-4dac-b1b8-a046ea0c90e0" Dec 08 20:19:56 crc kubenswrapper[4781]: E1208 20:19:56.981533 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" podUID="908c60a3-663f-4a4a-9223-fbcff50de2b9" Dec 08 20:19:56 crc kubenswrapper[4781]: E1208 20:19:56.981557 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\", failed to \"StartContainer\" 
for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" podUID="51e5598f-4979-4f4d-a947-323c19dd3102" Dec 08 20:19:56 crc kubenswrapper[4781]: E1208 20:19:56.981587 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" podUID="78753ed6-755b-4e63-8026-50722a9637a9" Dec 08 20:19:56 crc kubenswrapper[4781]: E1208 20:19:56.981623 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" podUID="9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e" Dec 08 20:19:56 crc kubenswrapper[4781]: E1208 20:19:56.983359 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f27e732ec1faee765461bf137d9be81278b2fa39675019a73622755e1e610b6f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" podUID="7f697324-2ce1-4c81-89c8-9cc53bac7062" Dec 08 20:19:57 crc kubenswrapper[4781]: I1208 20:19:57.048446 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:19:57 crc kubenswrapper[4781]: E1208 20:19:57.049126 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 20:19:57 crc kubenswrapper[4781]: E1208 20:19:57.049191 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert podName:47e01596-c50b-44f5-82fb-1b6c7a005d10 nodeName:}" failed. No retries permitted until 2025-12-08 20:20:01.049173061 +0000 UTC m=+917.200456428 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert") pod "infra-operator-controller-manager-78d48bff9d-5mbrr" (UID: "47e01596-c50b-44f5-82fb-1b6c7a005d10") : secret "infra-operator-webhook-server-cert" not found Dec 08 20:19:57 crc kubenswrapper[4781]: I1208 20:19:57.453030 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:19:57 crc kubenswrapper[4781]: E1208 20:19:57.453174 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:19:57 crc kubenswrapper[4781]: E1208 20:19:57.453234 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert podName:0bafa66c-7cf8-40eb-ae15-a4365fbe3176 nodeName:}" failed. No retries permitted until 2025-12-08 20:20:01.453216212 +0000 UTC m=+917.604499589 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert") pod "openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" (UID: "0bafa66c-7cf8-40eb-ae15-a4365fbe3176") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:19:57 crc kubenswrapper[4781]: I1208 20:19:57.891749 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:57 crc kubenswrapper[4781]: I1208 20:19:57.891828 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:19:57 crc kubenswrapper[4781]: E1208 20:19:57.891950 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 20:19:57 crc kubenswrapper[4781]: E1208 20:19:57.892003 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 20:19:57 crc kubenswrapper[4781]: E1208 20:19:57.892029 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:20:01.892004611 +0000 UTC m=+918.043287988 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "webhook-server-cert" not found Dec 08 20:19:57 crc kubenswrapper[4781]: E1208 20:19:57.892053 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:20:01.892038322 +0000 UTC m=+918.043321699 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "metrics-server-cert" not found Dec 08 20:20:01 crc kubenswrapper[4781]: I1208 20:20:01.081338 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:20:01 crc kubenswrapper[4781]: E1208 20:20:01.081555 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 20:20:01 crc kubenswrapper[4781]: E1208 20:20:01.081831 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert podName:47e01596-c50b-44f5-82fb-1b6c7a005d10 nodeName:}" failed. No retries permitted until 2025-12-08 20:20:09.081809543 +0000 UTC m=+925.233092930 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert") pod "infra-operator-controller-manager-78d48bff9d-5mbrr" (UID: "47e01596-c50b-44f5-82fb-1b6c7a005d10") : secret "infra-operator-webhook-server-cert" not found Dec 08 20:20:01 crc kubenswrapper[4781]: I1208 20:20:01.497282 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:20:01 crc kubenswrapper[4781]: E1208 20:20:01.497425 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:20:01 crc kubenswrapper[4781]: E1208 20:20:01.497518 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert podName:0bafa66c-7cf8-40eb-ae15-a4365fbe3176 nodeName:}" failed. No retries permitted until 2025-12-08 20:20:09.497494538 +0000 UTC m=+925.648777975 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert") pod "openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" (UID: "0bafa66c-7cf8-40eb-ae15-a4365fbe3176") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:20:01 crc kubenswrapper[4781]: I1208 20:20:01.903267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:01 crc kubenswrapper[4781]: I1208 20:20:01.903401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:01 crc kubenswrapper[4781]: E1208 20:20:01.903532 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 20:20:01 crc kubenswrapper[4781]: E1208 20:20:01.903689 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:20:09.903639649 +0000 UTC m=+926.054923026 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "webhook-server-cert" not found Dec 08 20:20:01 crc kubenswrapper[4781]: E1208 20:20:01.903603 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 20:20:01 crc kubenswrapper[4781]: E1208 20:20:01.903840 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:20:09.903822614 +0000 UTC m=+926.055105991 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "metrics-server-cert" not found Dec 08 20:20:08 crc kubenswrapper[4781]: E1208 20:20:08.022571 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3" Dec 08 20:20:08 crc kubenswrapper[4781]: E1208 20:20:08.023329 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:981b6a8f95934a86c5f10ef6e198b07265aeba7f11cf84b9ccd13dfaf06f3ca3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f4swk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6c677c69b-tk48g_openstack-operators(17686434-377d-4a2f-b25e-e0074d2e06c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:09 crc kubenswrapper[4781]: I1208 20:20:09.215545 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:20:09 crc kubenswrapper[4781]: E1208 20:20:09.215685 4781 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 08 20:20:09 crc kubenswrapper[4781]: E1208 20:20:09.216273 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert podName:47e01596-c50b-44f5-82fb-1b6c7a005d10 nodeName:}" failed. 
No retries permitted until 2025-12-08 20:20:25.216241553 +0000 UTC m=+941.367524930 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert") pod "infra-operator-controller-manager-78d48bff9d-5mbrr" (UID: "47e01596-c50b-44f5-82fb-1b6c7a005d10") : secret "infra-operator-webhook-server-cert" not found Dec 08 20:20:09 crc kubenswrapper[4781]: I1208 20:20:09.518404 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:20:09 crc kubenswrapper[4781]: E1208 20:20:09.518620 4781 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:20:09 crc kubenswrapper[4781]: E1208 20:20:09.518705 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert podName:0bafa66c-7cf8-40eb-ae15-a4365fbe3176 nodeName:}" failed. No retries permitted until 2025-12-08 20:20:25.518686344 +0000 UTC m=+941.669969711 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert") pod "openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" (UID: "0bafa66c-7cf8-40eb-ae15-a4365fbe3176") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 08 20:20:09 crc kubenswrapper[4781]: I1208 20:20:09.922689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:09 crc kubenswrapper[4781]: I1208 20:20:09.922824 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:09 crc kubenswrapper[4781]: E1208 20:20:09.922823 4781 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 08 20:20:09 crc kubenswrapper[4781]: E1208 20:20:09.922865 4781 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 08 20:20:09 crc kubenswrapper[4781]: E1208 20:20:09.922970 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:20:25.922948961 +0000 UTC m=+942.074232338 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "metrics-server-cert" not found Dec 08 20:20:09 crc kubenswrapper[4781]: E1208 20:20:09.923030 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs podName:0f03e65d-f0b8-4cfa-90bd-4a70de607c2d nodeName:}" failed. No retries permitted until 2025-12-08 20:20:25.923007253 +0000 UTC m=+942.074290690 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs") pod "openstack-operator-controller-manager-5b47684954-hcdhf" (UID: "0f03e65d-f0b8-4cfa-90bd-4a70de607c2d") : secret "webhook-server-cert" not found Dec 08 20:20:12 crc kubenswrapper[4781]: E1208 20:20:12.225579 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad" Dec 08 20:20:12 crc kubenswrapper[4781]: E1208 20:20:12.226123 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mkkgr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-79c8c4686c-m9tns_openstack-operators(2c1ab608-2b41-451a-b6b9-2cf867ab289b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:13 crc kubenswrapper[4781]: E1208 20:20:13.151211 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a" Dec 08 20:20:13 crc kubenswrapper[4781]: E1208 20:20:13.151425 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:900050d3501c0785b227db34b89883efe68247816e5c7427cacb74f8aa10605a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p6krf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-697fb699cf-cg5z2_openstack-operators(c1ed3c21-a6cb-43f6-a018-8bead69a5439): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:13 crc kubenswrapper[4781]: E1208 20:20:13.756933 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 08 20:20:13 crc kubenswrapper[4781]: E1208 20:20:13.757141 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9kxcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-zkslx_openstack-operators(3b159a78-5c17-430a-ac90-f1d4e7fac757): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:14 crc kubenswrapper[4781]: E1208 20:20:14.566107 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 08 20:20:14 crc kubenswrapper[4781]: E1208 20:20:14.566490 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qvs2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-967d97867-cv24b_openstack-operators(97f1308f-60a9-4e7f-b029-8bc13246ba9e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:15 crc kubenswrapper[4781]: E1208 20:20:15.232769 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 08 20:20:15 crc kubenswrapper[4781]: E1208 20:20:15.232978 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mvlkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-rdrrr_openstack-operators(b86b0a68-3152-46ac-8bde-3bfd32c6fbf2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:15 crc kubenswrapper[4781]: I1208 20:20:15.831031 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5b2dv"] Dec 08 20:20:15 crc kubenswrapper[4781]: I1208 20:20:15.832973 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:15 crc kubenswrapper[4781]: I1208 20:20:15.842046 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b2dv"] Dec 08 20:20:15 crc kubenswrapper[4781]: E1208 20:20:15.907313 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 08 20:20:15 crc kubenswrapper[4781]: E1208 20:20:15.907486 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wj6kr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5697bb5779-64kcb_openstack-operators(b44e4d42-05a5-42e2-8a45-5d5506fbbb23): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:15 crc kubenswrapper[4781]: I1208 20:20:15.925601 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-catalog-content\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:15 crc kubenswrapper[4781]: I1208 20:20:15.925658 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-utilities\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:15 crc kubenswrapper[4781]: I1208 20:20:15.925804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gz4t\" (UniqueName: \"kubernetes.io/projected/6111865f-840d-425c-8f1b-257dcf640da5-kube-api-access-8gz4t\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:16 crc kubenswrapper[4781]: I1208 20:20:16.027596 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gz4t\" (UniqueName: \"kubernetes.io/projected/6111865f-840d-425c-8f1b-257dcf640da5-kube-api-access-8gz4t\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:16 crc kubenswrapper[4781]: I1208 20:20:16.027690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-catalog-content\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:16 crc kubenswrapper[4781]: I1208 20:20:16.027715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-utilities\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:16 crc kubenswrapper[4781]: I1208 20:20:16.028266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-catalog-content\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:16 crc kubenswrapper[4781]: I1208 20:20:16.028313 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-utilities\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:16 crc kubenswrapper[4781]: I1208 20:20:16.055087 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gz4t\" (UniqueName: \"kubernetes.io/projected/6111865f-840d-425c-8f1b-257dcf640da5-kube-api-access-8gz4t\") pod \"redhat-marketplace-5b2dv\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:16 crc kubenswrapper[4781]: I1208 20:20:16.157083 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:16 crc kubenswrapper[4781]: E1208 20:20:16.627679 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 08 20:20:16 crc kubenswrapper[4781]: E1208 20:20:16.627872 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4czr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-p7f7t_openstack-operators(e00e8983-d123-42e8-a4ef-2a2bbda78cde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:22 crc kubenswrapper[4781]: I1208 20:20:22.627537 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b2dv"] Dec 08 20:20:22 crc kubenswrapper[4781]: I1208 20:20:22.915966 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dh86b"] Dec 08 20:20:22 crc kubenswrapper[4781]: I1208 20:20:22.917628 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:22 crc kubenswrapper[4781]: I1208 20:20:22.934308 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dh86b"] Dec 08 20:20:22 crc kubenswrapper[4781]: I1208 20:20:22.967468 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-catalog-content\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:22 crc kubenswrapper[4781]: I1208 20:20:22.967588 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9wlg\" (UniqueName: \"kubernetes.io/projected/d4a3a690-c4c6-47cd-838d-818956791ab5-kube-api-access-z9wlg\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:22 crc kubenswrapper[4781]: I1208 20:20:22.967675 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-utilities\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.068887 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-catalog-content\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.068996 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z9wlg\" (UniqueName: \"kubernetes.io/projected/d4a3a690-c4c6-47cd-838d-818956791ab5-kube-api-access-z9wlg\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.069063 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-utilities\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.069349 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-catalog-content\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.069491 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-utilities\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.090804 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wlg\" (UniqueName: \"kubernetes.io/projected/d4a3a690-c4c6-47cd-838d-818956791ab5-kube-api-access-z9wlg\") pod \"certified-operators-dh86b\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.237885 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.374735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b2dv" event={"ID":"6111865f-840d-425c-8f1b-257dcf640da5","Type":"ContainerStarted","Data":"d90e066f6335e4991efd26fd5204286d7aa554aac55fa66e0d5c84ac4ebafe8f"} Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.380222 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" event={"ID":"33445aa2-3e5b-4b50-ba7a-0f86d08dd64d","Type":"ContainerStarted","Data":"8a2a9e906cd82313cf18e41229491ae8b5ed2387db77e3434812020620e9ee66"} Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.394111 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" event={"ID":"76fef842-95b4-47f1-9c34-a4edc70a3cbf","Type":"ContainerStarted","Data":"7db0e04b5dab393802fcb2614f30aae613c662d2a91f25ea2c2fbd4220d832a5"} Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.396973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" event={"ID":"db28fdab-dab1-4d5c-9447-c895523b0985","Type":"ContainerStarted","Data":"53e925303e8a77c4b4d2302138dbae88a2b2e324591263c9344b714b5f7cf3be"} Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.401293 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" event={"ID":"b5739409-2e10-4acf-8088-99608fc2f489","Type":"ContainerStarted","Data":"8341d1246ad3ea56a8edfc4ac03d62855ddff6559847a40069a8505182df2219"} Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.408240 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" 
event={"ID":"5e629e44-f8b6-410a-baa9-b076e609686c","Type":"ContainerStarted","Data":"dcaa76d90a2fd53d5aa21c0409597dd74cc15e6b811065a9189d383e13aa45d1"} Dec 08 20:20:23 crc kubenswrapper[4781]: I1208 20:20:23.411822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" event={"ID":"344dd244-42be-4538-92c0-ab4be8f8a093","Type":"ContainerStarted","Data":"783b814415f762b2fc801b1f27fb6c42193865c732e1fa690c290b8e583f7cb5"} Dec 08 20:20:24 crc kubenswrapper[4781]: I1208 20:20:24.438582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" event={"ID":"78753ed6-755b-4e63-8026-50722a9637a9","Type":"ContainerStarted","Data":"844072a7e2f2a4cc2d8c3524677d5899054bf128729eeb5fa1fdf4a4320e0daf"} Dec 08 20:20:24 crc kubenswrapper[4781]: I1208 20:20:24.440218 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" event={"ID":"9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e","Type":"ContainerStarted","Data":"25a4c5dce376a349b51ac4d006012ede630da2252192deafe2df9257e937e6bf"} Dec 08 20:20:24 crc kubenswrapper[4781]: I1208 20:20:24.441323 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" event={"ID":"51e5598f-4979-4f4d-a947-323c19dd3102","Type":"ContainerStarted","Data":"bc7d0b66b3b2116b99ed2437dcad5b12a1c50aec433641f81a863503d7c86720"} Dec 08 20:20:25 crc kubenswrapper[4781]: I1208 20:20:25.253656 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:20:25 crc 
kubenswrapper[4781]: I1208 20:20:25.260224 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47e01596-c50b-44f5-82fb-1b6c7a005d10-cert\") pod \"infra-operator-controller-manager-78d48bff9d-5mbrr\" (UID: \"47e01596-c50b-44f5-82fb-1b6c7a005d10\") " pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:20:25 crc kubenswrapper[4781]: I1208 20:20:25.410054 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:20:25 crc kubenswrapper[4781]: I1208 20:20:25.565847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:20:25 crc kubenswrapper[4781]: I1208 20:20:25.570855 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bafa66c-7cf8-40eb-ae15-a4365fbe3176-cert\") pod \"openstack-baremetal-operator-controller-manager-744f8cb766hj2j5\" (UID: \"0bafa66c-7cf8-40eb-ae15-a4365fbe3176\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:20:25 crc kubenswrapper[4781]: I1208 20:20:25.727672 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:20:25 crc kubenswrapper[4781]: I1208 20:20:25.929863 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:25 crc kubenswrapper[4781]: I1208 20:20:25.929975 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:25 crc kubenswrapper[4781]: I1208 20:20:25.989987 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-webhook-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.007596 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f03e65d-f0b8-4cfa-90bd-4a70de607c2d-metrics-certs\") pod \"openstack-operator-controller-manager-5b47684954-hcdhf\" (UID: \"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d\") " pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.015209 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.445607 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bcc4x"] Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.447645 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.501338 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcc4x"] Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.505798 4781 generic.go:334] "Generic (PLEG): container finished" podID="6111865f-840d-425c-8f1b-257dcf640da5" containerID="17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c" exitCode=0 Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.505946 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b2dv" event={"ID":"6111865f-840d-425c-8f1b-257dcf640da5","Type":"ContainerDied","Data":"17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c"} Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.510822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" event={"ID":"7f697324-2ce1-4c81-89c8-9cc53bac7062","Type":"ContainerStarted","Data":"e29a7b5be85ccfc444440beabbb783643ce7b012572d77a69c19fc2f89b9e424"} Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.547067 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-catalog-content\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " pod="openshift-marketplace/community-operators-bcc4x" Dec 08 
20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.547121 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-utilities\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.547210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n964n\" (UniqueName: \"kubernetes.io/projected/15f5fcf6-2249-49dc-a4a1-18b746f370d0-kube-api-access-n964n\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.648603 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n964n\" (UniqueName: \"kubernetes.io/projected/15f5fcf6-2249-49dc-a4a1-18b746f370d0-kube-api-access-n964n\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.648700 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-catalog-content\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.648740 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-utilities\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " 
pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.649354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-catalog-content\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.649640 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-utilities\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.692537 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n964n\" (UniqueName: \"kubernetes.io/projected/15f5fcf6-2249-49dc-a4a1-18b746f370d0-kube-api-access-n964n\") pod \"community-operators-bcc4x\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:26 crc kubenswrapper[4781]: I1208 20:20:26.804556 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:27 crc kubenswrapper[4781]: E1208 20:20:27.039710 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" podUID="b44e4d42-05a5-42e2-8a45-5d5506fbbb23" Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.236593 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr"] Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.307247 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5"] Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.316430 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf"] Dec 08 20:20:27 crc kubenswrapper[4781]: E1208 20:20:27.324781 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" podUID="17686434-377d-4a2f-b25e-e0074d2e06c6" Dec 08 20:20:27 crc kubenswrapper[4781]: W1208 20:20:27.381153 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f03e65d_f0b8_4cfa_90bd_4a70de607c2d.slice/crio-0b940fa710288e7fc173faa577016ba3768db311ccec8f170e238e24e6a4a767 WatchSource:0}: Error finding container 0b940fa710288e7fc173faa577016ba3768db311ccec8f170e238e24e6a4a767: Status 404 returned error can't find the container with id 0b940fa710288e7fc173faa577016ba3768db311ccec8f170e238e24e6a4a767 Dec 08 
20:20:27 crc kubenswrapper[4781]: E1208 20:20:27.417705 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" podUID="c1ed3c21-a6cb-43f6-a018-8bead69a5439" Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.539664 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" event={"ID":"c1ed3c21-a6cb-43f6-a018-8bead69a5439","Type":"ContainerStarted","Data":"773f9407c5683bc88efd0ccf8a88f1aab01f1e334a9b580b6bb86d641d23d1b8"} Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.591529 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" event={"ID":"17686434-377d-4a2f-b25e-e0074d2e06c6","Type":"ContainerStarted","Data":"bcd3344e05a9ef1e3cc90876dfcfa29ba7dd8dcd7aa30c075af216a2d8546558"} Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.618544 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" event={"ID":"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d","Type":"ContainerStarted","Data":"0b940fa710288e7fc173faa577016ba3768db311ccec8f170e238e24e6a4a767"} Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.620679 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" event={"ID":"0bafa66c-7cf8-40eb-ae15-a4365fbe3176","Type":"ContainerStarted","Data":"86a1b6853d376a873422b66306d315ce15f8a5bbc0ef21c93d80d4aaf8dabc3a"} Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.623965 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dh86b"] Dec 08 20:20:27 crc 
kubenswrapper[4781]: E1208 20:20:27.628993 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" podUID="2c1ab608-2b41-451a-b6b9-2cf867ab289b" Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.668275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" event={"ID":"60bdaea6-28dd-4dac-b1b8-a046ea0c90e0","Type":"ContainerStarted","Data":"5a7a71e39721241321f40646a6f0b0e64e94d7a988ccd495425e54282634315b"} Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.674991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" event={"ID":"47e01596-c50b-44f5-82fb-1b6c7a005d10","Type":"ContainerStarted","Data":"3794d7ce5ec591150a09532613a35c1993bd2e6dd4206cd4842b15ed05f0bf07"} Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.684166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" event={"ID":"908c60a3-663f-4a4a-9223-fbcff50de2b9","Type":"ContainerStarted","Data":"2dffcb5257b959e66faca6a371c3f000cc577ea59aff03329dda11882066eea1"} Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.685683 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" event={"ID":"b44e4d42-05a5-42e2-8a45-5d5506fbbb23","Type":"ContainerStarted","Data":"d733e4962325616de18dfa858da622ab79bc15d2889024c377f7c4987393d394"} Dec 08 20:20:27 crc kubenswrapper[4781]: E1208 20:20:27.733136 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: 
context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" podUID="e00e8983-d123-42e8-a4ef-2a2bbda78cde" Dec 08 20:20:27 crc kubenswrapper[4781]: E1208 20:20:27.735216 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" podUID="b86b0a68-3152-46ac-8bde-3bfd32c6fbf2" Dec 08 20:20:27 crc kubenswrapper[4781]: I1208 20:20:27.801956 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p26p9" podStartSLOduration=6.68841262 podStartE2EDuration="33.80193344s" podCreationTimestamp="2025-12-08 20:19:54 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.636728994 +0000 UTC m=+911.788012371" lastFinishedPulling="2025-12-08 20:20:22.750249814 +0000 UTC m=+938.901533191" observedRunningTime="2025-12-08 20:20:27.792385906 +0000 UTC m=+943.943669283" watchObservedRunningTime="2025-12-08 20:20:27.80193344 +0000 UTC m=+943.953216817" Dec 08 20:20:28 crc kubenswrapper[4781]: E1208 20:20:28.040785 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" podUID="97f1308f-60a9-4e7f-b029-8bc13246ba9e" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.083726 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcc4x"] Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.728984 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" 
event={"ID":"76fef842-95b4-47f1-9c34-a4edc70a3cbf","Type":"ContainerStarted","Data":"5ff9ecf078d58e4d8ecaa0e4b2ccf6fc83a3f510a8f28b60f7a552fb1195e19f"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.731349 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.746379 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" event={"ID":"33445aa2-3e5b-4b50-ba7a-0f86d08dd64d","Type":"ContainerStarted","Data":"62e9db90cadc16dd4b2b07e55c3efa50eb153f12583604d8c8891de248bdafc0"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.747494 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.761445 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.762187 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.773966 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-bkrvn" podStartSLOduration=3.383677668 podStartE2EDuration="35.773939782s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.0585958 +0000 UTC m=+911.209879177" lastFinishedPulling="2025-12-08 20:20:27.448857914 +0000 UTC m=+943.600141291" observedRunningTime="2025-12-08 20:20:28.762318788 +0000 UTC m=+944.913602165" watchObservedRunningTime="2025-12-08 20:20:28.773939782 +0000 UTC 
m=+944.925223169" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.782875 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" event={"ID":"db28fdab-dab1-4d5c-9447-c895523b0985","Type":"ContainerStarted","Data":"aa5ecc9b27d317453a56355670954dec3877a9d0bab1c22ac6b2d7b2b885a17c"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.783936 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.792776 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.794369 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" event={"ID":"b5739409-2e10-4acf-8088-99608fc2f489","Type":"ContainerStarted","Data":"83b164c193ab2942ce94bb06af375d332018a592344e4f6b696efb5ffb2e8f5b"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.795419 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.799380 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.821386 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" event={"ID":"e00e8983-d123-42e8-a4ef-2a2bbda78cde","Type":"ContainerStarted","Data":"d1d627ee6f7147c9258cc69835785e4cd77dd7a4bf63d23a0f14fda3e9b60ff5"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.830061 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh86b" event={"ID":"d4a3a690-c4c6-47cd-838d-818956791ab5","Type":"ContainerStarted","Data":"4eb179634afbdb24086619c42f14069d55879f6f3823f0d68bb955ce6dfcc4a5"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.840836 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-spkzc" podStartSLOduration=3.619869745 podStartE2EDuration="35.840812653s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.172633547 +0000 UTC m=+911.323916924" lastFinishedPulling="2025-12-08 20:20:27.393576455 +0000 UTC m=+943.544859832" observedRunningTime="2025-12-08 20:20:28.836262833 +0000 UTC m=+944.987546220" watchObservedRunningTime="2025-12-08 20:20:28.840812653 +0000 UTC m=+944.992096030" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.857520 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" event={"ID":"51e5598f-4979-4f4d-a947-323c19dd3102","Type":"ContainerStarted","Data":"9602f17cb4aee9eca43873ddc46e6365a786f653d0d7a5c62731a44d05243adb"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.858192 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.878800 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" event={"ID":"2c1ab608-2b41-451a-b6b9-2cf867ab289b","Type":"ContainerStarted","Data":"1733664d91654971af89035a51c3ba2a6efec2c943c3c36a14c926a517e31804"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.886034 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-667bd8d554-fcn46" podStartSLOduration=4.102378921 podStartE2EDuration="35.886014872s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.613456315 +0000 UTC m=+911.764739692" lastFinishedPulling="2025-12-08 20:20:27.397092266 +0000 UTC m=+943.548375643" observedRunningTime="2025-12-08 20:20:28.885049825 +0000 UTC m=+945.036333212" watchObservedRunningTime="2025-12-08 20:20:28.886014872 +0000 UTC m=+945.037298249" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.894725 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" event={"ID":"5e629e44-f8b6-410a-baa9-b076e609686c","Type":"ContainerStarted","Data":"111d04976d416e2ce64895522b880ecb067e635d75549272c95142a4d04c0582"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.895308 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.901413 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.903654 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" event={"ID":"908c60a3-663f-4a4a-9223-fbcff50de2b9","Type":"ContainerStarted","Data":"ad4392dc1e8b61fa9ccee4f8d5341e4da481bc59631afd7268f31a496172f85e"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.903823 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.943871 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" event={"ID":"9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e","Type":"ContainerStarted","Data":"a70c0cb5726205d1da5fbf7fbe4b1b2c900eebffad5040940873fb7dfbe29328"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.944608 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.950911 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" podStartSLOduration=4.435825663 podStartE2EDuration="35.950899167s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.640372648 +0000 UTC m=+911.791656025" lastFinishedPulling="2025-12-08 20:20:27.155446152 +0000 UTC m=+943.306729529" observedRunningTime="2025-12-08 20:20:28.948193909 +0000 UTC m=+945.099477286" watchObservedRunningTime="2025-12-08 20:20:28.950899167 +0000 UTC m=+945.102182544" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.951263 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" event={"ID":"b86b0a68-3152-46ac-8bde-3bfd32c6fbf2","Type":"ContainerStarted","Data":"768c0360d846a5b80f6abdace023c2f94fb495a844182131c53cb7f0eea2be33"} Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.951777 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-dnh9j" podStartSLOduration=4.41660436 podStartE2EDuration="35.951771792s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.618935262 +0000 UTC m=+911.770218639" lastFinishedPulling="2025-12-08 20:20:27.154102694 +0000 UTC m=+943.305386071" observedRunningTime="2025-12-08 20:20:28.923614333 +0000 UTC 
m=+945.074897710" watchObservedRunningTime="2025-12-08 20:20:28.951771792 +0000 UTC m=+945.103055159" Dec 08 20:20:28 crc kubenswrapper[4781]: I1208 20:20:28.967328 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcc4x" event={"ID":"15f5fcf6-2249-49dc-a4a1-18b746f370d0","Type":"ContainerStarted","Data":"adb09ad8b41cae80524c02b624ec18f4b502c66fd4ecb4960863d27e0e6d4da6"} Dec 08 20:20:29 crc kubenswrapper[4781]: I1208 20:20:28.997943 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" podStartSLOduration=9.842944612 podStartE2EDuration="35.997908638s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.636208099 +0000 UTC m=+911.787491476" lastFinishedPulling="2025-12-08 20:20:21.791172125 +0000 UTC m=+937.942455502" observedRunningTime="2025-12-08 20:20:28.993574693 +0000 UTC m=+945.144858080" watchObservedRunningTime="2025-12-08 20:20:28.997908638 +0000 UTC m=+945.149192035" Dec 08 20:20:29 crc kubenswrapper[4781]: I1208 20:20:28.998867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" event={"ID":"0f03e65d-f0b8-4cfa-90bd-4a70de607c2d","Type":"ContainerStarted","Data":"e501f9046d22aa9b932e02b127bea34519b3b1359c989122619b5e2a56531fcc"} Dec 08 20:20:29 crc kubenswrapper[4781]: I1208 20:20:28.998908 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:29 crc kubenswrapper[4781]: E1208 20:20:29.018765 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" 
podUID="3b159a78-5c17-430a-ac90-f1d4e7fac757" Dec 08 20:20:29 crc kubenswrapper[4781]: I1208 20:20:29.025296 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" event={"ID":"97f1308f-60a9-4e7f-b029-8bc13246ba9e","Type":"ContainerStarted","Data":"1c633624a05292cac1c8e2641d34ffa953d072eb4cd0ab745ee67e771b2de766"} Dec 08 20:20:29 crc kubenswrapper[4781]: I1208 20:20:29.032254 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-7dhp5" podStartSLOduration=3.730496954 podStartE2EDuration="36.032233843s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.062323718 +0000 UTC m=+911.213607085" lastFinishedPulling="2025-12-08 20:20:27.364060597 +0000 UTC m=+943.515343974" observedRunningTime="2025-12-08 20:20:29.028756943 +0000 UTC m=+945.180040320" watchObservedRunningTime="2025-12-08 20:20:29.032233843 +0000 UTC m=+945.183517220" Dec 08 20:20:29 crc kubenswrapper[4781]: I1208 20:20:29.134829 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" podStartSLOduration=4.305739224 podStartE2EDuration="36.134812231s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.620094636 +0000 UTC m=+911.771378013" lastFinishedPulling="2025-12-08 20:20:27.449167643 +0000 UTC m=+943.600451020" observedRunningTime="2025-12-08 20:20:29.132515605 +0000 UTC m=+945.283798982" watchObservedRunningTime="2025-12-08 20:20:29.134812231 +0000 UTC m=+945.286095608" Dec 08 20:20:29 crc kubenswrapper[4781]: I1208 20:20:29.250755 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" podStartSLOduration=36.250732602 podStartE2EDuration="36.250732602s" 
podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:20:29.246610523 +0000 UTC m=+945.397893910" watchObservedRunningTime="2025-12-08 20:20:29.250732602 +0000 UTC m=+945.402015979" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.051815 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" event={"ID":"344dd244-42be-4538-92c0-ab4be8f8a093","Type":"ContainerStarted","Data":"c884385d9158b1326070059d8c91838fcc746f274085f3ce9374097d015b155c"} Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.052272 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.055545 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.056965 4781 generic.go:334] "Generic (PLEG): container finished" podID="6111865f-840d-425c-8f1b-257dcf640da5" containerID="020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8" exitCode=0 Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.057011 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b2dv" event={"ID":"6111865f-840d-425c-8f1b-257dcf640da5","Type":"ContainerDied","Data":"020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8"} Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.060856 4781 generic.go:334] "Generic (PLEG): container finished" podID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerID="003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e" exitCode=0 Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.060955 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcc4x" event={"ID":"15f5fcf6-2249-49dc-a4a1-18b746f370d0","Type":"ContainerDied","Data":"003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e"} Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.064745 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" event={"ID":"78753ed6-755b-4e63-8026-50722a9637a9","Type":"ContainerStarted","Data":"c63ee55f6bb3cc78642b97883ffb8ba8a8a3ce7d11ea0230197326c5795d889e"} Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.065183 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.067292 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" event={"ID":"7f697324-2ce1-4c81-89c8-9cc53bac7062","Type":"ContainerStarted","Data":"bd60fef46531cfd946115d0b2d6a966bb3b80de96689b6224d3f50b06f67e5c8"} Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.067951 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.071587 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.072064 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" event={"ID":"3b159a78-5c17-430a-ac90-f1d4e7fac757","Type":"ContainerStarted","Data":"31e9f2b2dd3967fdf943d20bfeef8cf9ef1efd9529bae71dbc5c2a9d70ca6acc"} Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 
20:20:30.078757 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.079893 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerID="bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563" exitCode=0 Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.080079 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh86b" event={"ID":"d4a3a690-c4c6-47cd-838d-818956791ab5","Type":"ContainerDied","Data":"bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563"} Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.083356 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fjhrn" podStartSLOduration=4.882608341 podStartE2EDuration="37.083341728s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.094992536 +0000 UTC m=+911.246275913" lastFinishedPulling="2025-12-08 20:20:27.295725923 +0000 UTC m=+943.447009300" observedRunningTime="2025-12-08 20:20:30.079085956 +0000 UTC m=+946.230369353" watchObservedRunningTime="2025-12-08 20:20:30.083341728 +0000 UTC m=+946.234625105" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.085054 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9d58d64bc-24k5k" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.085095 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5b5fd79c9c-swkcz" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.134390 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-fql79" podStartSLOduration=5.118607352 podStartE2EDuration="37.134374294s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.619875959 +0000 UTC m=+911.771159336" lastFinishedPulling="2025-12-08 20:20:27.635642901 +0000 UTC m=+943.786926278" observedRunningTime="2025-12-08 20:20:30.134099046 +0000 UTC m=+946.285382433" watchObservedRunningTime="2025-12-08 20:20:30.134374294 +0000 UTC m=+946.285657671" Dec 08 20:20:30 crc kubenswrapper[4781]: I1208 20:20:30.179688 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-58d5ff84df-j5vws" podStartSLOduration=5.402711557 podStartE2EDuration="37.179671396s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.632308297 +0000 UTC m=+911.783591674" lastFinishedPulling="2025-12-08 20:20:27.409268136 +0000 UTC m=+943.560551513" observedRunningTime="2025-12-08 20:20:30.176937337 +0000 UTC m=+946.328220714" watchObservedRunningTime="2025-12-08 20:20:30.179671396 +0000 UTC m=+946.330954773" Dec 08 20:20:34 crc kubenswrapper[4781]: I1208 20:20:34.280122 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-2b4gp" Dec 08 20:20:36 crc kubenswrapper[4781]: I1208 20:20:36.020350 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b47684954-hcdhf" Dec 08 20:20:37 crc kubenswrapper[4781]: I1208 20:20:37.129704 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" event={"ID":"b44e4d42-05a5-42e2-8a45-5d5506fbbb23","Type":"ContainerStarted","Data":"95fd171c96bf3675ca360cf0bd687a3de4a90b6de3e98e2f633b22768af286dc"} Dec 08 20:20:37 crc kubenswrapper[4781]: 
I1208 20:20:37.131553 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" event={"ID":"c1ed3c21-a6cb-43f6-a018-8bead69a5439","Type":"ContainerStarted","Data":"e1c042ce062f5fd4126da23a942c1f1f332fe829d7f75e3fb002471a291f57d9"} Dec 08 20:20:38 crc kubenswrapper[4781]: I1208 20:20:38.141236 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" event={"ID":"17686434-377d-4a2f-b25e-e0074d2e06c6","Type":"ContainerStarted","Data":"a8a69c93e65d73924056022c0a15eaeeb7001dc8fcad59c9e1100aee0a3b4aea"} Dec 08 20:20:38 crc kubenswrapper[4781]: I1208 20:20:38.141460 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" Dec 08 20:20:38 crc kubenswrapper[4781]: I1208 20:20:38.141625 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" Dec 08 20:20:38 crc kubenswrapper[4781]: I1208 20:20:38.166200 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" podStartSLOduration=10.570068345 podStartE2EDuration="45.166172726s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:54.21920673 +0000 UTC m=+910.370490107" lastFinishedPulling="2025-12-08 20:20:28.815311111 +0000 UTC m=+944.966594488" observedRunningTime="2025-12-08 20:20:38.156535189 +0000 UTC m=+954.307818566" watchObservedRunningTime="2025-12-08 20:20:38.166172726 +0000 UTC m=+954.317456113" Dec 08 20:20:38 crc kubenswrapper[4781]: I1208 20:20:38.179497 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" podStartSLOduration=11.102852335 
podStartE2EDuration="45.179481088s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:54.739700077 +0000 UTC m=+910.890983464" lastFinishedPulling="2025-12-08 20:20:28.81632884 +0000 UTC m=+944.967612217" observedRunningTime="2025-12-08 20:20:38.17817049 +0000 UTC m=+954.329453877" watchObservedRunningTime="2025-12-08 20:20:38.179481088 +0000 UTC m=+954.330764465" Dec 08 20:20:38 crc kubenswrapper[4781]: I1208 20:20:38.193394 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" podStartSLOduration=11.508039628 podStartE2EDuration="45.193373537s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.124062822 +0000 UTC m=+911.275346199" lastFinishedPulling="2025-12-08 20:20:28.809396731 +0000 UTC m=+944.960680108" observedRunningTime="2025-12-08 20:20:38.191774181 +0000 UTC m=+954.343057578" watchObservedRunningTime="2025-12-08 20:20:38.193373537 +0000 UTC m=+954.344656914" Dec 08 20:20:42 crc kubenswrapper[4781]: E1208 20:20:42.868454 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:9d539fb6b72f91cfc6200bb91b7c6dbaeab17c7711342dd3a9549c66762a2d48" Dec 08 20:20:42 crc kubenswrapper[4781]: E1208 20:20:42.870943 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:9d539fb6b72f91cfc6200bb91b7c6dbaeab17c7711342dd3a9549c66762a2d48,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent@sha256:3ba2aa79928f91c0461022459de272a09bd40e3167ecacbd81c4f5195e2e0950,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner@sha256:2444fe898df68969a7978bb84fd12c3c61dc371f264156ff0a877d8aab1f9f4e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api@sha256:2d87021f2f291525dda4c17e8fcd2fbef60780450d7941be423bcfd4047cabd2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator@sha256:3473a5f5c914f9ba397ffc5ea9d8eeedd85d31a3c9244df7457f3c3e74eaefc4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener@sha256:c1c8f583529e123a7105ebc2249ab19267313f30138867840d1e65b9390f1886,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier@sha256:8dcd62d8f75c4dbf0afc27fa96cd481c56d8fb174fa29abafa0d29616eded790,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24@sha256:6b929971283d69f485a7d3e449fb5a3dd65d5a4de585c73419e776821d00062c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAU
LT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener@sha256:a76d2c46403c03704dcfe7de49454496300d60d849ee81076d8637b272043c69,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker@sha256:d2fbe075d21195b746fd27a073dbd249d38b3c4f81c30d162770a338fb87e338,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:43a24796dabde68270dbfefa107205e173fdd6a0dc701502858cadbede69da31,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:667029e1ec7e63fffa1a096f432f6160b441ba36df1bddc9066cbd1129b82009,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi@sha256:b2785dbc3ceaa930dff8068bbb8654af2e0b40a9c2632300641cb8348e9cf43d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter@sha256:7211a617ec657701ca819aa0ba28e1d5750f5bf2c1391b755cc4a48cc360b0fa,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification@sha256:f17b61f2318b74648e174d73dd31deee6c0d1434605c9f32707aedf2f4378957,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core@sha256:09b5017c95d7697e66b9c64846bc48ef5826a009cba89b956ec54561e5f4a2d1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup@sha256:0b088
61590e3646584af0fc7c7d8a743a35b4f5964d6fd355f206daa9ae999ce,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler@sha256:e26fb8ad7808ca8efe268881f9229df90a755b24bd4ad5501ba3b8c5c16987a3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume@sha256:cfeb4e264c00408dee5196b06003722b6dda540a3f26d3ff90abfd795729833b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api@sha256:d530b56f4720e873c3944d1e317c91530f71ee2443b9945041e42edf792ee380,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor@sha256:1774b00fe0320e863a0175aa3ed8fdff94978e8dd30bde4a888e559d9e2700cc,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api@sha256:2f9748f10c87efbee801c70f46b3dc5c6532ca070af558a4fb45cb34dbbb6f04,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9@sha256:1d69ad383cb03ef808c1f737427c5ca2385e28a3af1861a4336b6e539b346c27,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central@sha256:112fed4b9de0ccf15011e8a3a26ce6efbbe8e7d8eb3d4153d1a1874b9bde6d68,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns@sha256:aa87158aeb1194f4940126197b912ea972fafe12ea5c1f89a07d6ccfafc16f77,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer@sha256:fcd3bf8112793023be72845ce3a984beabd5a
3cb369c11252130076ed38b3770,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound@sha256:fee9fc72864ee217aace1cf11cb090ef41935841f9c60127d775dc2989330777,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker@sha256:239967aef48587f275c9636d8f89e476d909dbba57fea64d8196ddacf6817450,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr@sha256:7a0ade11985653bb8ad2646b0848eb6f7128d21d85b99551ac17f74293087a30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid@sha256:8ab175d7ee42e22e0ca1ebf98d180112428758a86ef8adccaba8f3653567f6ab,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler@sha256:581b65b646301e0fcb07582150ba63438f1353a85bf9acf1eb2acb4ce71c58bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron@sha256:d7e43361d50d1e7d4c99e499eee56aa50591855836638742666303dc59096258,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd@sha256:5b59d54dc4a23373a5172f15f5497b287422c32f5702efd1e171c3f2048c9842,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent@sha256:f3227beee5b52411de42c6a37ceda7d8f68934b4671a2d661403f8c1c0eab6d6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:db3e3d71618c3539a2853a20f7684f016b67370157990932291b00a48fa16bd3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,
Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent@sha256:e6dfe5f67adec298afbb57aec95c9cf89b4757ccfea8d8be66ef0ffd8c58322f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent@sha256:ba46c29c79c92487b6b9f0db11a517269c6455b8b9786e9d2692f4e24e43d552,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent@sha256:4f1c6fcf33354f1cbbc914c1709310be2fa4fe0dd64e5dbf3f91d6f0634bd28f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api@sha256:ecf469bd360c2aa2e5eb57826585c19a10ebe9f683790803dc4989a46c11789e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn@sha256:d506b2ca02a16cdab757b38a86d40e0459094c7269067de89beb3edf4a50bf5e,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine@sha256:2f2aabcd1b45f9fb3034d28e9a49acac72d7917fd1bbfbbc498e69e8be0b7b2b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f,ValueFrom:n
il,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached@sha256:dad2336390cae6705133deefaa09c9e39512cf29133aa009006e3962c8022108,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis@sha256:21edb042683b37827463124ceb159fa316e8cf0ac6040dc464f5242300b9daad,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api@sha256:21334e97e6b4194d803a60d0ecfa33327bf248e7507683ea9dcb33a28a2ec858,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor@sha256:4deb460a113324762b3139301c6aacd48c57204d8d13eb1c387d7064ec19db0d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector@sha256:942f9cbe36d328caa5d68b398703b2be5d7b7dc2b034a72d2ae62416cb7be208,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent@sha256:9e2ae3ac44ed2495b0f4398d7419b1e8e1321bec32a0ab043aabf28aa8b33384,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe@sha256:7cb9e377fa81bbe84fcc006b27c45d56ea3d6ed2144fb9ebf5fb8df5b920d423,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent@sha256:9d930c44b5d90b140117dd05d976d10d29d93eed9a70118e594e00da64594562,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone@sha256:a7b6fa2f16a882674624b48939737e2bd95da7bef60db593a8e6e4d397fa516c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics
@sha256:db384bf43222b066c378e77027a675d4cd9911107adba46c2922b3a55e10d6fb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api@sha256:68714e821f8e4e2d905d6e5bc7fb2e713a24c02db48901fb2a11d57b80f6c584,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler@sha256:4a8b11fbc23e097869f8f347e78a409b294573732987dd8fa6493888a3ff68d2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share@sha256:57007fab45f2d8fbf929d26609a2e566fbcb006e05d78ca72b9d0b71af866305,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:d9a2d8b19d3de4658dd26a2e781d00002e937738bfe6a1d0cf6c68c015085f4a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils@sha256:b0f8d8a4d29d8d4667205df4a94bacefcdd7a33981407c20bd7dd320f27308b7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api@sha256:255cc3471ee112b17da164148b0ec25678332061b5b488868b81a30e5afb5bb5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:d6189c79b326e4b09ccae1141528b03bc59b2533781a960e8f91f2a5dbb343d5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:a96d336d231eee461559cfe82b025874ce2b8652520297bc5143559694ebac58,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,
Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy@sha256:eaf80338dc065eb9c8c1f40552793c7cc2ff052c88c789f0a5d3e34099549adb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler@sha256:98a3cff4a3aae37148c4c982a0e37f21a476528cbd74734f59ae22f61fdb6fc1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api@sha256:a7089bcd0a2dbc014b29391dbd14b3fbc3ba0abd0f36bd16cb3b594cfa001464,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager@sha256:36cc3ee813bccbfb639f17896bd98028521e3cc5740a5d07f91e119729a76a69,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping@sha256:61807c42b6197326d9483d65972029117cea6d373ae913fd359993d8e12fff13,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog@sha256:61dbee4a2559eda45dadf8d2b121cd85f79043d7cb2c1a62f176261042c3e39c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker@sha256:c4652e3a9c4275470c3ef1a2e4d20a420d9c7bdd5157b0bbdaafea3fa038dcab,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:c1b8da8298ec8be0ca22c7d8ba48da103e72dfe7ed5e9427b971d31eac3a8b33,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather@sha256:854a802357b4f565a366fce3bf29b20c1b768ec4ab7e822ef52dfc2fef000d2c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter@sha2
56:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi@sha256:dec5870172c510ae43ff98398260fe595288af59302709d71fc2a020763deb88,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:ebeb25c4a4ce978c741d166518070e05f0fd81c143bdc680ee1d8f5985ec8d6c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base@sha256:1e53a53dfe9b3cb757e4d666e76c8989941eb4f0b98d629a7f697a1693aacb17,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server@sha256:726a3df0e94cfdcef301fe88fa8d91972914ec2104fb6fa1d8e4c325981712a6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd@sha256:c8e13f116261ef06b59e9034c605f68d53eb6f760426c35ee6ed3785b97b1800,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server@sha256:e554a5816081a60a0ae6fd1464c1f0a11cf2133707a4b220a023ecae7b302eed,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:f24234939afca841e46ea4d17bec959b63705ab0e75476465e777d44905c5f1b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account@sha256:4eb3a9c95f57df34ab88b952d8ad2057d60ac0aa4526a51070bea5d64e3aeeee,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CO
NTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container@sha256:ae1279cd0af8af3863925d149db4c514dfda0c159a8084216b7228a35f238678,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object@sha256:fcb1f8a778d8cffa0f42efdcbde01061cb3aaaccc3453e65a4b213d553ad344c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server@sha256:3c89899d53b3bca91830c259434c074f27554824a9cdcf117158c4a4329810f5,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all@sha256:a1a7ba434daff518f09d8f4075e76308402e9b7a0b5b641ac2ef721fbf88752a,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api@sha256:11e1e1a70328e425bde2197db720fcab4a6cfbf552dd01fb7778f319afee1fcd,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier@sha256:14277b17906e902b31f7257a34e3e6a69a0cc8ba12690862d5a88bd7ecc2830d,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine@sha256:f5327977eb0d69a4c7fd4a9eb00c0e975c8f9635ae45197947bf209e6c28d13b,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h6lsw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-744f8cb766hj2j5_openstack-operators(0bafa66c-7cf8-40eb-ae15-a4365fbe3176): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:43 crc kubenswrapper[4781]: I1208 20:20:43.452074 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/cinder-operator-controller-manager-6c677c69b-tk48g" Dec 08 20:20:43 crc kubenswrapper[4781]: I1208 20:20:43.473072 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-697fb699cf-cg5z2" Dec 08 20:20:43 crc kubenswrapper[4781]: I1208 20:20:43.552203 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" Dec 08 20:20:43 crc kubenswrapper[4781]: I1208 20:20:43.554805 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5697bb5779-64kcb" Dec 08 20:20:43 crc kubenswrapper[4781]: E1208 20:20:43.774111 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:ccc60d56d8efc2e91a7d8a7131eb7e06c189c32247f2a819818c084ba2e2f2ab" Dec 08 20:20:43 crc kubenswrapper[4781]: E1208 20:20:43.774242 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:ccc60d56d8efc2e91a7d8a7131eb7e06c189c32247f2a819818c084ba2e2f2ab,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2c89,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-78d48bff9d-5mbrr_openstack-operators(47e01596-c50b-44f5-82fb-1b6c7a005d10): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:20:44 crc kubenswrapper[4781]: I1208 20:20:44.199343 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-5b2dv" event={"ID":"6111865f-840d-425c-8f1b-257dcf640da5","Type":"ContainerStarted","Data":"0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f"} Dec 08 20:20:44 crc kubenswrapper[4781]: I1208 20:20:44.214001 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcc4x" event={"ID":"15f5fcf6-2249-49dc-a4a1-18b746f370d0","Type":"ContainerStarted","Data":"08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac"} Dec 08 20:20:44 crc kubenswrapper[4781]: I1208 20:20:44.234050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" event={"ID":"e00e8983-d123-42e8-a4ef-2a2bbda78cde","Type":"ContainerStarted","Data":"65808ba2e510910ed191cc40763fe8e67c674de7b65c3178e21f2200797dc9e7"} Dec 08 20:20:44 crc kubenswrapper[4781]: I1208 20:20:44.234899 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" Dec 08 20:20:44 crc kubenswrapper[4781]: I1208 20:20:44.344731 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5b2dv" podStartSLOduration=12.933416758 podStartE2EDuration="29.344695541s" podCreationTimestamp="2025-12-08 20:20:15 +0000 UTC" firstStartedPulling="2025-12-08 20:20:26.519061745 +0000 UTC m=+942.670345132" lastFinishedPulling="2025-12-08 20:20:42.930340538 +0000 UTC m=+959.081623915" observedRunningTime="2025-12-08 20:20:44.343512507 +0000 UTC m=+960.494795884" watchObservedRunningTime="2025-12-08 20:20:44.344695541 +0000 UTC m=+960.495978928" Dec 08 20:20:44 crc kubenswrapper[4781]: E1208 20:20:44.449363 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" podUID="47e01596-c50b-44f5-82fb-1b6c7a005d10" Dec 08 20:20:44 crc kubenswrapper[4781]: I1208 20:20:44.519452 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" podStartSLOduration=3.323045646 podStartE2EDuration="51.519428482s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.595158949 +0000 UTC m=+911.746442326" lastFinishedPulling="2025-12-08 20:20:43.791541785 +0000 UTC m=+959.942825162" observedRunningTime="2025-12-08 20:20:44.518400852 +0000 UTC m=+960.669684229" watchObservedRunningTime="2025-12-08 20:20:44.519428482 +0000 UTC m=+960.670711859" Dec 08 20:20:44 crc kubenswrapper[4781]: E1208 20:20:44.543887 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" podUID="0bafa66c-7cf8-40eb-ae15-a4365fbe3176" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.255395 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh86b" event={"ID":"d4a3a690-c4c6-47cd-838d-818956791ab5","Type":"ContainerStarted","Data":"38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8"} Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.256813 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" event={"ID":"97f1308f-60a9-4e7f-b029-8bc13246ba9e","Type":"ContainerStarted","Data":"145dec1d7478362e2e0da60b509ff3a0850c0de69635801587be16743ee13110"} Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.256886 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.259871 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" event={"ID":"0bafa66c-7cf8-40eb-ae15-a4365fbe3176","Type":"ContainerStarted","Data":"a198e2900bf3e29ec6018b640638e5c67b76d92d9c188f9c6fe02848090347f6"} Dec 08 20:20:45 crc kubenswrapper[4781]: E1208 20:20:45.262146 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:9d539fb6b72f91cfc6200bb91b7c6dbaeab17c7711342dd3a9549c66762a2d48\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" podUID="0bafa66c-7cf8-40eb-ae15-a4365fbe3176" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.263897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" event={"ID":"3b159a78-5c17-430a-ac90-f1d4e7fac757","Type":"ContainerStarted","Data":"4cdc803fee0c5e2514eb49a4807c8f5e666e833c05faea2a79d1300886d0d0fa"} Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.264031 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.265684 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" event={"ID":"b86b0a68-3152-46ac-8bde-3bfd32c6fbf2","Type":"ContainerStarted","Data":"1896fe20cfa77650ed7a074a5229baacb3d95673d6b93d23fadbb2cfe0009e7b"} Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.265964 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.267423 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" event={"ID":"2c1ab608-2b41-451a-b6b9-2cf867ab289b","Type":"ContainerStarted","Data":"bb95189dab79002f7d1c54ee0eef49d1addf4a9e63c3bd80a3ae71fce7b3fc50"} Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.267548 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.269402 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" event={"ID":"47e01596-c50b-44f5-82fb-1b6c7a005d10","Type":"ContainerStarted","Data":"f786724e62c54e80aaab1eaa3edca6df5e1236b6fd07a742ccc7b481f4021418"} Dec 08 20:20:45 crc kubenswrapper[4781]: E1208 20:20:45.271426 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:ccc60d56d8efc2e91a7d8a7131eb7e06c189c32247f2a819818c084ba2e2f2ab\\\"\"" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" podUID="47e01596-c50b-44f5-82fb-1b6c7a005d10" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.303203 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" podStartSLOduration=3.708788571 podStartE2EDuration="52.303185584s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.191196491 +0000 UTC m=+911.342479868" lastFinishedPulling="2025-12-08 20:20:43.785593504 +0000 UTC m=+959.936876881" observedRunningTime="2025-12-08 
20:20:45.295620177 +0000 UTC m=+961.446903554" watchObservedRunningTime="2025-12-08 20:20:45.303185584 +0000 UTC m=+961.454468961" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.358330 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" podStartSLOduration=4.175934945 podStartE2EDuration="52.358311968s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.596608491 +0000 UTC m=+911.747891868" lastFinishedPulling="2025-12-08 20:20:43.778985514 +0000 UTC m=+959.930268891" observedRunningTime="2025-12-08 20:20:45.346094707 +0000 UTC m=+961.497378084" watchObservedRunningTime="2025-12-08 20:20:45.358311968 +0000 UTC m=+961.509595345" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.374449 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" podStartSLOduration=4.175516832 podStartE2EDuration="52.374432381s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.595091387 +0000 UTC m=+911.746374764" lastFinishedPulling="2025-12-08 20:20:43.794006936 +0000 UTC m=+959.945290313" observedRunningTime="2025-12-08 20:20:45.372721372 +0000 UTC m=+961.524004749" watchObservedRunningTime="2025-12-08 20:20:45.374432381 +0000 UTC m=+961.525715758" Dec 08 20:20:45 crc kubenswrapper[4781]: I1208 20:20:45.467251 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" podStartSLOduration=3.931386837 podStartE2EDuration="52.467230098s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:19:55.256418005 +0000 UTC m=+911.407701382" lastFinishedPulling="2025-12-08 20:20:43.792261266 +0000 UTC m=+959.943544643" observedRunningTime="2025-12-08 20:20:45.466135966 +0000 
UTC m=+961.617419343" watchObservedRunningTime="2025-12-08 20:20:45.467230098 +0000 UTC m=+961.618513485" Dec 08 20:20:46 crc kubenswrapper[4781]: I1208 20:20:46.159064 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:46 crc kubenswrapper[4781]: I1208 20:20:46.159448 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:46 crc kubenswrapper[4781]: I1208 20:20:46.281056 4781 generic.go:334] "Generic (PLEG): container finished" podID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerID="08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac" exitCode=0 Dec 08 20:20:46 crc kubenswrapper[4781]: I1208 20:20:46.281129 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcc4x" event={"ID":"15f5fcf6-2249-49dc-a4a1-18b746f370d0","Type":"ContainerDied","Data":"08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac"} Dec 08 20:20:46 crc kubenswrapper[4781]: I1208 20:20:46.283230 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 20:20:46 crc kubenswrapper[4781]: I1208 20:20:46.283444 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerID="38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8" exitCode=0 Dec 08 20:20:46 crc kubenswrapper[4781]: I1208 20:20:46.283599 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh86b" event={"ID":"d4a3a690-c4c6-47cd-838d-818956791ab5","Type":"ContainerDied","Data":"38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8"} Dec 08 20:20:46 crc kubenswrapper[4781]: E1208 20:20:46.285407 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:9d539fb6b72f91cfc6200bb91b7c6dbaeab17c7711342dd3a9549c66762a2d48\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" podUID="0bafa66c-7cf8-40eb-ae15-a4365fbe3176" Dec 08 20:20:46 crc kubenswrapper[4781]: E1208 20:20:46.285510 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:ccc60d56d8efc2e91a7d8a7131eb7e06c189c32247f2a819818c084ba2e2f2ab\\\"\"" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" podUID="47e01596-c50b-44f5-82fb-1b6c7a005d10" Dec 08 20:20:47 crc kubenswrapper[4781]: I1208 20:20:47.207824 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5b2dv" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="registry-server" probeResult="failure" output=< Dec 08 20:20:47 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 08 20:20:47 crc kubenswrapper[4781]: > Dec 08 20:20:47 crc kubenswrapper[4781]: I1208 20:20:47.294446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcc4x" event={"ID":"15f5fcf6-2249-49dc-a4a1-18b746f370d0","Type":"ContainerStarted","Data":"2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999"} Dec 08 20:20:47 crc kubenswrapper[4781]: I1208 20:20:47.297631 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh86b" event={"ID":"d4a3a690-c4c6-47cd-838d-818956791ab5","Type":"ContainerStarted","Data":"eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c"} Dec 08 20:20:47 crc kubenswrapper[4781]: I1208 20:20:47.316662 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-bcc4x" podStartSLOduration=4.6798982 podStartE2EDuration="21.316640752s" podCreationTimestamp="2025-12-08 20:20:26 +0000 UTC" firstStartedPulling="2025-12-08 20:20:30.06218577 +0000 UTC m=+946.213469147" lastFinishedPulling="2025-12-08 20:20:46.698928322 +0000 UTC m=+962.850211699" observedRunningTime="2025-12-08 20:20:47.313862102 +0000 UTC m=+963.465145489" watchObservedRunningTime="2025-12-08 20:20:47.316640752 +0000 UTC m=+963.467924129" Dec 08 20:20:47 crc kubenswrapper[4781]: I1208 20:20:47.339633 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dh86b" podStartSLOduration=8.747564314 podStartE2EDuration="25.339613432s" podCreationTimestamp="2025-12-08 20:20:22 +0000 UTC" firstStartedPulling="2025-12-08 20:20:30.082216355 +0000 UTC m=+946.233499752" lastFinishedPulling="2025-12-08 20:20:46.674265493 +0000 UTC m=+962.825548870" observedRunningTime="2025-12-08 20:20:47.33815941 +0000 UTC m=+963.489442787" watchObservedRunningTime="2025-12-08 20:20:47.339613432 +0000 UTC m=+963.490896819" Dec 08 20:20:53 crc kubenswrapper[4781]: I1208 20:20:53.238539 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:53 crc kubenswrapper[4781]: I1208 20:20:53.239106 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:53 crc kubenswrapper[4781]: I1208 20:20:53.309268 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:53 crc kubenswrapper[4781]: I1208 20:20:53.391717 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:53 crc kubenswrapper[4781]: I1208 20:20:53.542026 4781 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-dh86b"] Dec 08 20:20:53 crc kubenswrapper[4781]: I1208 20:20:53.626607 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-967d97867-cv24b" Dec 08 20:20:54 crc kubenswrapper[4781]: I1208 20:20:54.001260 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-zkslx" Dec 08 20:20:54 crc kubenswrapper[4781]: I1208 20:20:54.031274 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-79c8c4686c-m9tns" Dec 08 20:20:54 crc kubenswrapper[4781]: I1208 20:20:54.038178 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-p7f7t" Dec 08 20:20:54 crc kubenswrapper[4781]: I1208 20:20:54.137338 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-rdrrr" Dec 08 20:20:55 crc kubenswrapper[4781]: I1208 20:20:55.386297 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dh86b" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerName="registry-server" containerID="cri-o://eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c" gracePeriod=2 Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.218624 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.265711 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.302486 4781 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.394434 4781 generic.go:334] "Generic (PLEG): container finished" podID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerID="eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c" exitCode=0 Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.394512 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dh86b" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.394514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh86b" event={"ID":"d4a3a690-c4c6-47cd-838d-818956791ab5","Type":"ContainerDied","Data":"eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c"} Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.394574 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dh86b" event={"ID":"d4a3a690-c4c6-47cd-838d-818956791ab5","Type":"ContainerDied","Data":"4eb179634afbdb24086619c42f14069d55879f6f3823f0d68bb955ce6dfcc4a5"} Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.394595 4781 scope.go:117] "RemoveContainer" containerID="eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.411862 4781 scope.go:117] "RemoveContainer" containerID="38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.432612 4781 scope.go:117] "RemoveContainer" containerID="bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.441808 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9wlg\" (UniqueName: 
\"kubernetes.io/projected/d4a3a690-c4c6-47cd-838d-818956791ab5-kube-api-access-z9wlg\") pod \"d4a3a690-c4c6-47cd-838d-818956791ab5\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.441891 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-catalog-content\") pod \"d4a3a690-c4c6-47cd-838d-818956791ab5\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.442058 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-utilities\") pod \"d4a3a690-c4c6-47cd-838d-818956791ab5\" (UID: \"d4a3a690-c4c6-47cd-838d-818956791ab5\") " Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.443482 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-utilities" (OuterVolumeSpecName: "utilities") pod "d4a3a690-c4c6-47cd-838d-818956791ab5" (UID: "d4a3a690-c4c6-47cd-838d-818956791ab5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.449784 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a3a690-c4c6-47cd-838d-818956791ab5-kube-api-access-z9wlg" (OuterVolumeSpecName: "kube-api-access-z9wlg") pod "d4a3a690-c4c6-47cd-838d-818956791ab5" (UID: "d4a3a690-c4c6-47cd-838d-818956791ab5"). InnerVolumeSpecName "kube-api-access-z9wlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.454233 4781 scope.go:117] "RemoveContainer" containerID="eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c" Dec 08 20:20:56 crc kubenswrapper[4781]: E1208 20:20:56.455492 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c\": container with ID starting with eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c not found: ID does not exist" containerID="eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.455552 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c"} err="failed to get container status \"eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c\": rpc error: code = NotFound desc = could not find container \"eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c\": container with ID starting with eeaa29b9b7051ab78bb1897fed9cfe53b3d65b60298d3f992e6e72c50990b62c not found: ID does not exist" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.455596 4781 scope.go:117] "RemoveContainer" containerID="38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8" Dec 08 20:20:56 crc kubenswrapper[4781]: E1208 20:20:56.456290 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8\": container with ID starting with 38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8 not found: ID does not exist" containerID="38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.456335 
4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8"} err="failed to get container status \"38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8\": rpc error: code = NotFound desc = could not find container \"38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8\": container with ID starting with 38e59d5338a3e2ff4d349c5bc3a55d8519cbcd58da6a103fedd38e4a3d49e8d8 not found: ID does not exist" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.456363 4781 scope.go:117] "RemoveContainer" containerID="bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563" Dec 08 20:20:56 crc kubenswrapper[4781]: E1208 20:20:56.456977 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563\": container with ID starting with bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563 not found: ID does not exist" containerID="bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.457041 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563"} err="failed to get container status \"bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563\": rpc error: code = NotFound desc = could not find container \"bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563\": container with ID starting with bfe11b1622985f84aea6af355fb76dede023e6af291ff407e6144d9d4e14a563 not found: ID does not exist" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.488745 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "d4a3a690-c4c6-47cd-838d-818956791ab5" (UID: "d4a3a690-c4c6-47cd-838d-818956791ab5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.543657 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.543695 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4a3a690-c4c6-47cd-838d-818956791ab5-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.543739 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9wlg\" (UniqueName: \"kubernetes.io/projected/d4a3a690-c4c6-47cd-838d-818956791ab5-kube-api-access-z9wlg\") on node \"crc\" DevicePath \"\"" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.741624 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dh86b"] Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.746805 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dh86b"] Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.805306 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.805385 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:56 crc kubenswrapper[4781]: E1208 20:20:56.824547 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a3a690_c4c6_47cd_838d_818956791ab5.slice/crio-4eb179634afbdb24086619c42f14069d55879f6f3823f0d68bb955ce6dfcc4a5\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a3a690_c4c6_47cd_838d_818956791ab5.slice\": RecentStats: unable to find data in memory cache]" Dec 08 20:20:56 crc kubenswrapper[4781]: I1208 20:20:56.858233 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:57 crc kubenswrapper[4781]: I1208 20:20:57.446302 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:20:57 crc kubenswrapper[4781]: I1208 20:20:57.946888 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b2dv"] Dec 08 20:20:57 crc kubenswrapper[4781]: I1208 20:20:57.947380 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5b2dv" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="registry-server" containerID="cri-o://0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f" gracePeriod=2 Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.182383 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" path="/var/lib/kubelet/pods/d4a3a690-c4c6-47cd-838d-818956791ab5/volumes" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.371098 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.412393 4781 generic.go:334] "Generic (PLEG): container finished" podID="6111865f-840d-425c-8f1b-257dcf640da5" containerID="0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f" exitCode=0 Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.412448 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b2dv" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.412456 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b2dv" event={"ID":"6111865f-840d-425c-8f1b-257dcf640da5","Type":"ContainerDied","Data":"0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f"} Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.412484 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b2dv" event={"ID":"6111865f-840d-425c-8f1b-257dcf640da5","Type":"ContainerDied","Data":"d90e066f6335e4991efd26fd5204286d7aa554aac55fa66e0d5c84ac4ebafe8f"} Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.412500 4781 scope.go:117] "RemoveContainer" containerID="0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.415274 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" event={"ID":"0bafa66c-7cf8-40eb-ae15-a4365fbe3176","Type":"ContainerStarted","Data":"9dcd1323471fc2e8d7ce74c4ea75a1a57b7bb74bab1a5f87c5c29e79d6bf1899"} Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.415634 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.433138 
4781 scope.go:117] "RemoveContainer" containerID="020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.454097 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" podStartSLOduration=35.370327922 podStartE2EDuration="1m5.454077476s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:20:27.448274767 +0000 UTC m=+943.599558154" lastFinishedPulling="2025-12-08 20:20:57.532024331 +0000 UTC m=+973.683307708" observedRunningTime="2025-12-08 20:20:58.442651257 +0000 UTC m=+974.593934654" watchObservedRunningTime="2025-12-08 20:20:58.454077476 +0000 UTC m=+974.605360853" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.454585 4781 scope.go:117] "RemoveContainer" containerID="17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.480264 4781 scope.go:117] "RemoveContainer" containerID="0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f" Dec 08 20:20:58 crc kubenswrapper[4781]: E1208 20:20:58.480729 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f\": container with ID starting with 0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f not found: ID does not exist" containerID="0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.480758 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f"} err="failed to get container status \"0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f\": rpc error: code = NotFound desc = could not find 
container \"0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f\": container with ID starting with 0b43db88e69abd01a3ad492afac1e15d776625dc6dc59e32eedf767c0f55c52f not found: ID does not exist" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.480778 4781 scope.go:117] "RemoveContainer" containerID="020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8" Dec 08 20:20:58 crc kubenswrapper[4781]: E1208 20:20:58.481683 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8\": container with ID starting with 020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8 not found: ID does not exist" containerID="020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.481750 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8"} err="failed to get container status \"020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8\": rpc error: code = NotFound desc = could not find container \"020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8\": container with ID starting with 020fd5f3b68c9caf8adb1b4ce22e1aaca0831e138073f4a586ea4d4edfa31ac8 not found: ID does not exist" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.481776 4781 scope.go:117] "RemoveContainer" containerID="17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c" Dec 08 20:20:58 crc kubenswrapper[4781]: E1208 20:20:58.482105 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c\": container with ID starting with 17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c not found: ID does 
not exist" containerID="17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.482133 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c"} err="failed to get container status \"17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c\": rpc error: code = NotFound desc = could not find container \"17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c\": container with ID starting with 17e07d77a7c656e27c98991e1a0b82d2498698fa5eefdb9d0be3c259f6cc318c not found: ID does not exist" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.483497 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-utilities\") pod \"6111865f-840d-425c-8f1b-257dcf640da5\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.483585 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-catalog-content\") pod \"6111865f-840d-425c-8f1b-257dcf640da5\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.484155 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gz4t\" (UniqueName: \"kubernetes.io/projected/6111865f-840d-425c-8f1b-257dcf640da5-kube-api-access-8gz4t\") pod \"6111865f-840d-425c-8f1b-257dcf640da5\" (UID: \"6111865f-840d-425c-8f1b-257dcf640da5\") " Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.484690 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-utilities" (OuterVolumeSpecName: "utilities") 
pod "6111865f-840d-425c-8f1b-257dcf640da5" (UID: "6111865f-840d-425c-8f1b-257dcf640da5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.484862 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.488554 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6111865f-840d-425c-8f1b-257dcf640da5-kube-api-access-8gz4t" (OuterVolumeSpecName: "kube-api-access-8gz4t") pod "6111865f-840d-425c-8f1b-257dcf640da5" (UID: "6111865f-840d-425c-8f1b-257dcf640da5"). InnerVolumeSpecName "kube-api-access-8gz4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.502671 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6111865f-840d-425c-8f1b-257dcf640da5" (UID: "6111865f-840d-425c-8f1b-257dcf640da5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.585648 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6111865f-840d-425c-8f1b-257dcf640da5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.585681 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gz4t\" (UniqueName: \"kubernetes.io/projected/6111865f-840d-425c-8f1b-257dcf640da5-kube-api-access-8gz4t\") on node \"crc\" DevicePath \"\"" Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.741883 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b2dv"] Dec 08 20:20:58 crc kubenswrapper[4781]: I1208 20:20:58.748076 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b2dv"] Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.135479 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6111865f-840d-425c-8f1b-257dcf640da5" path="/var/lib/kubelet/pods/6111865f-840d-425c-8f1b-257dcf640da5/volumes" Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.341056 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcc4x"] Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.341502 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bcc4x" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerName="registry-server" containerID="cri-o://2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999" gracePeriod=2 Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.744846 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.918003 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-utilities\") pod \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.918062 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n964n\" (UniqueName: \"kubernetes.io/projected/15f5fcf6-2249-49dc-a4a1-18b746f370d0-kube-api-access-n964n\") pod \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.918110 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-catalog-content\") pod \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\" (UID: \"15f5fcf6-2249-49dc-a4a1-18b746f370d0\") " Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.918883 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-utilities" (OuterVolumeSpecName: "utilities") pod "15f5fcf6-2249-49dc-a4a1-18b746f370d0" (UID: "15f5fcf6-2249-49dc-a4a1-18b746f370d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.922246 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f5fcf6-2249-49dc-a4a1-18b746f370d0-kube-api-access-n964n" (OuterVolumeSpecName: "kube-api-access-n964n") pod "15f5fcf6-2249-49dc-a4a1-18b746f370d0" (UID: "15f5fcf6-2249-49dc-a4a1-18b746f370d0"). InnerVolumeSpecName "kube-api-access-n964n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:21:00 crc kubenswrapper[4781]: I1208 20:21:00.965411 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15f5fcf6-2249-49dc-a4a1-18b746f370d0" (UID: "15f5fcf6-2249-49dc-a4a1-18b746f370d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.019388 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.019437 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n964n\" (UniqueName: \"kubernetes.io/projected/15f5fcf6-2249-49dc-a4a1-18b746f370d0-kube-api-access-n964n\") on node \"crc\" DevicePath \"\"" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.019449 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f5fcf6-2249-49dc-a4a1-18b746f370d0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.438614 4781 generic.go:334] "Generic (PLEG): container finished" podID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerID="2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999" exitCode=0 Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.438677 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bcc4x" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.438696 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcc4x" event={"ID":"15f5fcf6-2249-49dc-a4a1-18b746f370d0","Type":"ContainerDied","Data":"2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999"} Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.439022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcc4x" event={"ID":"15f5fcf6-2249-49dc-a4a1-18b746f370d0","Type":"ContainerDied","Data":"adb09ad8b41cae80524c02b624ec18f4b502c66fd4ecb4960863d27e0e6d4da6"} Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.439042 4781 scope.go:117] "RemoveContainer" containerID="2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.460212 4781 scope.go:117] "RemoveContainer" containerID="08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.483208 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcc4x"] Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.486073 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bcc4x"] Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.492005 4781 scope.go:117] "RemoveContainer" containerID="003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.507768 4781 scope.go:117] "RemoveContainer" containerID="2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999" Dec 08 20:21:01 crc kubenswrapper[4781]: E1208 20:21:01.508368 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999\": container with ID starting with 2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999 not found: ID does not exist" containerID="2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.508448 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999"} err="failed to get container status \"2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999\": rpc error: code = NotFound desc = could not find container \"2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999\": container with ID starting with 2923acdbe8d2af5de89f57da7fec7be055e57bb8055936c1dc6e28986920d999 not found: ID does not exist" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.508477 4781 scope.go:117] "RemoveContainer" containerID="08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac" Dec 08 20:21:01 crc kubenswrapper[4781]: E1208 20:21:01.509062 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac\": container with ID starting with 08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac not found: ID does not exist" containerID="08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.509102 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac"} err="failed to get container status \"08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac\": rpc error: code = NotFound desc = could not find container \"08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac\": container with ID 
starting with 08ec59328ffd24dc1709ce9ac49f39e59434b342b05b3fa9b9ca83ecab7c7eac not found: ID does not exist" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.509127 4781 scope.go:117] "RemoveContainer" containerID="003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e" Dec 08 20:21:01 crc kubenswrapper[4781]: E1208 20:21:01.509449 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e\": container with ID starting with 003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e not found: ID does not exist" containerID="003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e" Dec 08 20:21:01 crc kubenswrapper[4781]: I1208 20:21:01.509535 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e"} err="failed to get container status \"003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e\": rpc error: code = NotFound desc = could not find container \"003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e\": container with ID starting with 003807ceeae087d8dcf2baf2accb30e31635ff19a8d891c20f2a8797a6bd527e not found: ID does not exist" Dec 08 20:21:02 crc kubenswrapper[4781]: I1208 20:21:02.136588 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" path="/var/lib/kubelet/pods/15f5fcf6-2249-49dc-a4a1-18b746f370d0/volumes" Dec 08 20:21:02 crc kubenswrapper[4781]: I1208 20:21:02.451374 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" event={"ID":"47e01596-c50b-44f5-82fb-1b6c7a005d10","Type":"ContainerStarted","Data":"16e2a85c7b784021cc08c21e5abd2c02ea96aa8a5a389d86fe2b283a7e896a15"} Dec 08 20:21:02 crc kubenswrapper[4781]: I1208 
20:21:02.451689 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:21:02 crc kubenswrapper[4781]: I1208 20:21:02.478268 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" podStartSLOduration=35.247399811 podStartE2EDuration="1m9.478249394s" podCreationTimestamp="2025-12-08 20:19:53 +0000 UTC" firstStartedPulling="2025-12-08 20:20:27.316621654 +0000 UTC m=+943.467905031" lastFinishedPulling="2025-12-08 20:21:01.547471237 +0000 UTC m=+977.698754614" observedRunningTime="2025-12-08 20:21:02.472807238 +0000 UTC m=+978.624090615" watchObservedRunningTime="2025-12-08 20:21:02.478249394 +0000 UTC m=+978.629532771" Dec 08 20:21:05 crc kubenswrapper[4781]: I1208 20:21:05.735399 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-744f8cb766hj2j5" Dec 08 20:21:15 crc kubenswrapper[4781]: I1208 20:21:15.416116 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-78d48bff9d-5mbrr" Dec 08 20:21:29 crc kubenswrapper[4781]: I1208 20:21:29.948180 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:21:29 crc kubenswrapper[4781]: I1208 20:21:29.948872 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.622973 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-fv7pt"] Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623583 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerName="extract-content" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.623599 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerName="extract-content" Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623621 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerName="extract-content" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.623630 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerName="extract-content" Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623651 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerName="registry-server" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.623659 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerName="registry-server" Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623676 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="extract-content" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.623683 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="extract-content" Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623700 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerName="registry-server" Dec 08 20:21:31 crc 
kubenswrapper[4781]: I1208 20:21:31.623707 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerName="registry-server" Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623725 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="registry-server" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.623732 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="registry-server" Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623930 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerName="extract-utilities" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.623940 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerName="extract-utilities" Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623955 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="extract-utilities" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.623961 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="extract-utilities" Dec 08 20:21:31 crc kubenswrapper[4781]: E1208 20:21:31.623979 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerName="extract-utilities" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.623987 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerName="extract-utilities" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.624160 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a3a690-c4c6-47cd-838d-818956791ab5" containerName="registry-server" Dec 08 20:21:31 crc 
kubenswrapper[4781]: I1208 20:21:31.624183 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f5fcf6-2249-49dc-a4a1-18b746f370d0" containerName="registry-server" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.624200 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6111865f-840d-425c-8f1b-257dcf640da5" containerName="registry-server" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.625045 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.628334 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.628814 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.628993 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-67b6f" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.629002 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.638500 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-fv7pt"] Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.666019 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567c455747-jb8jk"] Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.667389 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.671748 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.678476 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-jb8jk"] Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.716966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6z2l\" (UniqueName: \"kubernetes.io/projected/934f62c5-5393-4fbc-919b-aa66c1764fde-kube-api-access-z6z2l\") pod \"dnsmasq-dns-5cd484bb89-fv7pt\" (UID: \"934f62c5-5393-4fbc-919b-aa66c1764fde\") " pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.717023 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-dns-svc\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.717063 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-config\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.717211 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x27lh\" (UniqueName: \"kubernetes.io/projected/8d6cf391-8ec5-42dc-9618-89d928d6a85c-kube-api-access-x27lh\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " 
pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.717367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f62c5-5393-4fbc-919b-aa66c1764fde-config\") pod \"dnsmasq-dns-5cd484bb89-fv7pt\" (UID: \"934f62c5-5393-4fbc-919b-aa66c1764fde\") " pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.818494 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x27lh\" (UniqueName: \"kubernetes.io/projected/8d6cf391-8ec5-42dc-9618-89d928d6a85c-kube-api-access-x27lh\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.818606 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f62c5-5393-4fbc-919b-aa66c1764fde-config\") pod \"dnsmasq-dns-5cd484bb89-fv7pt\" (UID: \"934f62c5-5393-4fbc-919b-aa66c1764fde\") " pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.818658 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6z2l\" (UniqueName: \"kubernetes.io/projected/934f62c5-5393-4fbc-919b-aa66c1764fde-kube-api-access-z6z2l\") pod \"dnsmasq-dns-5cd484bb89-fv7pt\" (UID: \"934f62c5-5393-4fbc-919b-aa66c1764fde\") " pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.818691 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-dns-svc\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 
20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.818711 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-config\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.819709 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-config\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.819730 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f62c5-5393-4fbc-919b-aa66c1764fde-config\") pod \"dnsmasq-dns-5cd484bb89-fv7pt\" (UID: \"934f62c5-5393-4fbc-919b-aa66c1764fde\") " pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.820212 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-dns-svc\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.838639 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6z2l\" (UniqueName: \"kubernetes.io/projected/934f62c5-5393-4fbc-919b-aa66c1764fde-kube-api-access-z6z2l\") pod \"dnsmasq-dns-5cd484bb89-fv7pt\" (UID: \"934f62c5-5393-4fbc-919b-aa66c1764fde\") " pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.839021 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x27lh\" (UniqueName: \"kubernetes.io/projected/8d6cf391-8ec5-42dc-9618-89d928d6a85c-kube-api-access-x27lh\") pod \"dnsmasq-dns-567c455747-jb8jk\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.945471 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:21:31 crc kubenswrapper[4781]: I1208 20:21:31.983264 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:21:32 crc kubenswrapper[4781]: I1208 20:21:32.401892 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-fv7pt"] Dec 08 20:21:32 crc kubenswrapper[4781]: I1208 20:21:32.459716 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-jb8jk"] Dec 08 20:21:32 crc kubenswrapper[4781]: W1208 20:21:32.462617 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d6cf391_8ec5_42dc_9618_89d928d6a85c.slice/crio-c9ab1cc443b50a1ea68d1e8e941585a337de71f655a753d17a9e92e983a97154 WatchSource:0}: Error finding container c9ab1cc443b50a1ea68d1e8e941585a337de71f655a753d17a9e92e983a97154: Status 404 returned error can't find the container with id c9ab1cc443b50a1ea68d1e8e941585a337de71f655a753d17a9e92e983a97154 Dec 08 20:21:32 crc kubenswrapper[4781]: I1208 20:21:32.686990 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" event={"ID":"934f62c5-5393-4fbc-919b-aa66c1764fde","Type":"ContainerStarted","Data":"124d9ae748add665e2e095b48f9356055f530463ffab226c9423fb8a2ac9f263"} Dec 08 20:21:32 crc kubenswrapper[4781]: I1208 20:21:32.689345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-jb8jk" 
event={"ID":"8d6cf391-8ec5-42dc-9618-89d928d6a85c","Type":"ContainerStarted","Data":"c9ab1cc443b50a1ea68d1e8e941585a337de71f655a753d17a9e92e983a97154"} Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.550695 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-jb8jk"] Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.594134 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-4qvh7"] Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.600780 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.621128 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-4qvh7"] Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.660746 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z54hz\" (UniqueName: \"kubernetes.io/projected/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-kube-api-access-z54hz\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.660844 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.660906 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-config\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " 
pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.762371 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.762483 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-config\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.762514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z54hz\" (UniqueName: \"kubernetes.io/projected/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-kube-api-access-z54hz\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.764180 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.764680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-config\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.789222 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z54hz\" (UniqueName: \"kubernetes.io/projected/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-kube-api-access-z54hz\") pod \"dnsmasq-dns-bc4b48fc9-4qvh7\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.870542 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-fv7pt"] Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.898572 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb666b895-vwrcm"] Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.905335 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.918588 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-vwrcm"] Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.935843 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.965008 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-config\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.965074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtbcv\" (UniqueName: \"kubernetes.io/projected/1453615c-ee69-4362-87c3-844a3bb16faf-kube-api-access-xtbcv\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:34 crc kubenswrapper[4781]: I1208 20:21:34.965138 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-dns-svc\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.066936 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-config\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.066991 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtbcv\" (UniqueName: \"kubernetes.io/projected/1453615c-ee69-4362-87c3-844a3bb16faf-kube-api-access-xtbcv\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " 
pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.067101 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-dns-svc\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.070328 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-dns-svc\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.070570 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-config\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.104545 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtbcv\" (UniqueName: \"kubernetes.io/projected/1453615c-ee69-4362-87c3-844a3bb16faf-kube-api-access-xtbcv\") pod \"dnsmasq-dns-cb666b895-vwrcm\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.232239 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.489976 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-4qvh7"] Dec 08 20:21:35 crc kubenswrapper[4781]: W1208 20:21:35.497638 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bc05ac8_a1b5_458d_b1df_ed7fa1908349.slice/crio-c46ae4c6c95bcdec2afd98288820a7e774efe0ee8ca80b860a3642c6911a7e0b WatchSource:0}: Error finding container c46ae4c6c95bcdec2afd98288820a7e774efe0ee8ca80b860a3642c6911a7e0b: Status 404 returned error can't find the container with id c46ae4c6c95bcdec2afd98288820a7e774efe0ee8ca80b860a3642c6911a7e0b Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.727062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" event={"ID":"3bc05ac8-a1b5-458d-b1df-ed7fa1908349","Type":"ContainerStarted","Data":"c46ae4c6c95bcdec2afd98288820a7e774efe0ee8ca80b860a3642c6911a7e0b"} Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.742059 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.743244 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.747462 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.747593 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.747832 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.747909 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.748023 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.748203 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mftgd" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.748232 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.760358 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.781675 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.781906 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782000 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgpxx\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-kube-api-access-dgpxx\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782079 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782167 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-config-data\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782250 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9192ae66-92ec-4618-aecd-3ec306da8525-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782384 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") 
pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782502 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782573 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.782733 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9192ae66-92ec-4618-aecd-3ec306da8525-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.783756 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-vwrcm"] Dec 08 20:21:35 crc kubenswrapper[4781]: W1208 20:21:35.788077 4781 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1453615c_ee69_4362_87c3_844a3bb16faf.slice/crio-892b7e90809a24e7d096e7c836f62c95915c1d7e5def08074a1929def2768889 WatchSource:0}: Error finding container 892b7e90809a24e7d096e7c836f62c95915c1d7e5def08074a1929def2768889: Status 404 returned error can't find the container with id 892b7e90809a24e7d096e7c836f62c95915c1d7e5def08074a1929def2768889 Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883612 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgpxx\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-kube-api-access-dgpxx\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883727 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-config-data\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883754 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9192ae66-92ec-4618-aecd-3ec306da8525-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883773 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883793 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883837 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883870 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883967 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9192ae66-92ec-4618-aecd-3ec306da8525-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.883991 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.884052 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.884604 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.885101 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.885127 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-config-data\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.885665 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.885875 4781 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.886205 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-server-conf\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.893014 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9192ae66-92ec-4618-aecd-3ec306da8525-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.895381 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.897425 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.904871 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgpxx\" (UniqueName: 
\"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-kube-api-access-dgpxx\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.905277 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:35 crc kubenswrapper[4781]: I1208 20:21:35.915116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9192ae66-92ec-4618-aecd-3ec306da8525-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " pod="openstack/rabbitmq-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.006867 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.012542 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.029114 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.029164 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.029165 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.029129 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.029338 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mkjkl" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.029866 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.030200 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.048738 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.078355 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348048 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxnll\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-kube-api-access-hxnll\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348117 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/202f9454-de0b-4a09-abb6-dacbea9b5fa4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348143 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348160 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348234 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348259 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348282 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348301 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.348360 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/202f9454-de0b-4a09-abb6-dacbea9b5fa4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449428 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/202f9454-de0b-4a09-abb6-dacbea9b5fa4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449477 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449501 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449532 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 
20:21:36.449554 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449572 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449592 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449612 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449663 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/202f9454-de0b-4a09-abb6-dacbea9b5fa4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449684 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hxnll\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-kube-api-access-hxnll\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449706 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.449941 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.457315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.457632 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.458194 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.459027 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.459215 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/202f9454-de0b-4a09-abb6-dacbea9b5fa4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.459468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.464188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/202f9454-de0b-4a09-abb6-dacbea9b5fa4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.465522 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc 
kubenswrapper[4781]: I1208 20:21:36.466528 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.473992 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.562720 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxnll\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-kube-api-access-hxnll\") pod \"rabbitmq-cell1-server-0\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.646890 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.742771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-vwrcm" event={"ID":"1453615c-ee69-4362-87c3-844a3bb16faf","Type":"ContainerStarted","Data":"892b7e90809a24e7d096e7c836f62c95915c1d7e5def08074a1929def2768889"} Dec 08 20:21:36 crc kubenswrapper[4781]: I1208 20:21:36.927906 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:21:36 crc kubenswrapper[4781]: W1208 20:21:36.951050 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9192ae66_92ec_4618_aecd_3ec306da8525.slice/crio-82841c65ce8bbb78824ed1dfc40a0711e453dad420f4369c2215c046535bebaa WatchSource:0}: Error finding container 82841c65ce8bbb78824ed1dfc40a0711e453dad420f4369c2215c046535bebaa: Status 404 returned error can't find the container with id 82841c65ce8bbb78824ed1dfc40a0711e453dad420f4369c2215c046535bebaa Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.521255 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:21:37 crc kubenswrapper[4781]: W1208 20:21:37.548160 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod202f9454_de0b_4a09_abb6_dacbea9b5fa4.slice/crio-7422fda4dbfbc56963c51e0c3feb53051d5a13bd68dc6ff62ddc57aa36101ded WatchSource:0}: Error finding container 7422fda4dbfbc56963c51e0c3feb53051d5a13bd68dc6ff62ddc57aa36101ded: Status 404 returned error can't find the container with id 7422fda4dbfbc56963c51e0c3feb53051d5a13bd68dc6ff62ddc57aa36101ded Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.633967 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.636549 4781 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.638959 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-djsv7" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.642772 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.644262 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.644747 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.646163 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.648659 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.780014 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.780344 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-config-data-default\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.784411 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.784474 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-kolla-config\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.784498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.784560 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j98fb\" (UniqueName: \"kubernetes.io/projected/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-kube-api-access-j98fb\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.784638 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.784673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.788108 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"202f9454-de0b-4a09-abb6-dacbea9b5fa4","Type":"ContainerStarted","Data":"7422fda4dbfbc56963c51e0c3feb53051d5a13bd68dc6ff62ddc57aa36101ded"} Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.794027 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9192ae66-92ec-4618-aecd-3ec306da8525","Type":"ContainerStarted","Data":"82841c65ce8bbb78824ed1dfc40a0711e453dad420f4369c2215c046535bebaa"} Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.885848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.885933 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.885959 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-config-data-default\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 
crc kubenswrapper[4781]: I1208 20:21:37.885985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.886016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-kolla-config\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.886040 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.886082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j98fb\" (UniqueName: \"kubernetes.io/projected/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-kube-api-access-j98fb\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.886118 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.887461 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" 
(UniqueName: \"kubernetes.io/empty-dir/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.887748 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-kolla-config\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.887815 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.888107 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.888505 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-config-data-default\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.903952 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.904468 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.925558 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j98fb\" (UniqueName: \"kubernetes.io/projected/88416662-c07f-4d9f-b9cb-7f92d21aaa6f-kube-api-access-j98fb\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.928266 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"88416662-c07f-4d9f-b9cb-7f92d21aaa6f\") " pod="openstack/openstack-galera-0" Dec 08 20:21:37 crc kubenswrapper[4781]: I1208 20:21:37.966346 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.673163 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.777941 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.779437 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.785753 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.786822 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.787417 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rgngb" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.787805 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.788626 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.806777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.806839 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/95c2c3c0-0733-4bac-bf28-0805d8c9a499-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.806864 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.806909 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c2c3c0-0733-4bac-bf28-0805d8c9a499-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.806951 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c2c3c0-0733-4bac-bf28-0805d8c9a499-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.806987 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.807022 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.807204 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z7k8w\" (UniqueName: \"kubernetes.io/projected/95c2c3c0-0733-4bac-bf28-0805d8c9a499-kube-api-access-z7k8w\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.908384 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.908447 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/95c2c3c0-0733-4bac-bf28-0805d8c9a499-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.908464 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.908506 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c2c3c0-0733-4bac-bf28-0805d8c9a499-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.908526 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95c2c3c0-0733-4bac-bf28-0805d8c9a499-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.908555 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.908584 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.908614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7k8w\" (UniqueName: \"kubernetes.io/projected/95c2c3c0-0733-4bac-bf28-0805d8c9a499-kube-api-access-z7k8w\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.909298 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.909471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/95c2c3c0-0733-4bac-bf28-0805d8c9a499-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.910118 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.910413 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.911476 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95c2c3c0-0733-4bac-bf28-0805d8c9a499-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.927995 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/95c2c3c0-0733-4bac-bf28-0805d8c9a499-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.928401 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7k8w\" (UniqueName: \"kubernetes.io/projected/95c2c3c0-0733-4bac-bf28-0805d8c9a499-kube-api-access-z7k8w\") pod 
\"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.929186 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c2c3c0-0733-4bac-bf28-0805d8c9a499-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:38 crc kubenswrapper[4781]: I1208 20:21:38.952452 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"95c2c3c0-0733-4bac-bf28-0805d8c9a499\") " pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.184016 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.363294 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.364199 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.374527 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.374540 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.382038 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pbc52" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.412639 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.432608 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.432662 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n99sg\" (UniqueName: \"kubernetes.io/projected/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-kube-api-access-n99sg\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.432730 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-config-data\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.432773 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.432821 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-kolla-config\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.534143 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.534220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-kolla-config\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.534263 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.534292 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n99sg\" (UniqueName: \"kubernetes.io/projected/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-kube-api-access-n99sg\") pod \"memcached-0\" (UID: 
\"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.534367 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-config-data\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.535477 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-config-data\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.535721 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-kolla-config\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.616975 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.619120 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.626401 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n99sg\" (UniqueName: 
\"kubernetes.io/projected/3be01047-1cc6-4ed4-9d41-68b1f67f7a11-kube-api-access-n99sg\") pod \"memcached-0\" (UID: \"3be01047-1cc6-4ed4-9d41-68b1f67f7a11\") " pod="openstack/memcached-0" Dec 08 20:21:39 crc kubenswrapper[4781]: I1208 20:21:39.709518 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 08 20:21:40 crc kubenswrapper[4781]: I1208 20:21:40.869958 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 20:21:40 crc kubenswrapper[4781]: I1208 20:21:40.886090 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 20:21:40 crc kubenswrapper[4781]: I1208 20:21:40.890805 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nqw58" Dec 08 20:21:40 crc kubenswrapper[4781]: I1208 20:21:40.896955 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 20:21:40 crc kubenswrapper[4781]: I1208 20:21:40.977387 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d87bz\" (UniqueName: \"kubernetes.io/projected/2ca70357-2cfc-42dd-a9cf-2c5f992ba62d-kube-api-access-d87bz\") pod \"kube-state-metrics-0\" (UID: \"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d\") " pod="openstack/kube-state-metrics-0" Dec 08 20:21:41 crc kubenswrapper[4781]: I1208 20:21:41.094416 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d87bz\" (UniqueName: \"kubernetes.io/projected/2ca70357-2cfc-42dd-a9cf-2c5f992ba62d-kube-api-access-d87bz\") pod \"kube-state-metrics-0\" (UID: \"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d\") " pod="openstack/kube-state-metrics-0" Dec 08 20:21:41 crc kubenswrapper[4781]: I1208 20:21:41.141043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d87bz\" (UniqueName: 
\"kubernetes.io/projected/2ca70357-2cfc-42dd-a9cf-2c5f992ba62d-kube-api-access-d87bz\") pod \"kube-state-metrics-0\" (UID: \"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d\") " pod="openstack/kube-state-metrics-0" Dec 08 20:21:41 crc kubenswrapper[4781]: I1208 20:21:41.268367 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.487893 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kf4ss"] Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.498998 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.499361 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kf4ss"] Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.501518 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.501752 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jkjfq" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.501766 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.620181 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9cv7c"] Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.624557 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.627296 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-combined-ca-bundle\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.627617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-run-ovn\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.628009 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-ovn-controller-tls-certs\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.628045 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-run\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.628169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq9zr\" (UniqueName: \"kubernetes.io/projected/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-kube-api-access-gq9zr\") pod \"ovn-controller-kf4ss\" (UID: 
\"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.628249 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-log-ovn\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.628416 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-scripts\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.640438 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9cv7c"] Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729326 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-scripts\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729392 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-combined-ca-bundle\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729422 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-run-ovn\") pod 
\"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729449 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b675bf96-ecb2-4098-891f-6a87e0ed5140-scripts\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729481 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-log\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729507 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-ovn-controller-tls-certs\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729522 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-run\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729539 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-etc-ovs\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " 
pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729559 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq9zr\" (UniqueName: \"kubernetes.io/projected/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-kube-api-access-gq9zr\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-log-ovn\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729615 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-run\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729636 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-lib\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.729673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgjxq\" (UniqueName: \"kubernetes.io/projected/b675bf96-ecb2-4098-891f-6a87e0ed5140-kube-api-access-lgjxq\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 
crc kubenswrapper[4781]: I1208 20:21:44.731177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-run-ovn\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.731221 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-log-ovn\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.731352 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-var-run\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.733227 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-scripts\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.735748 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-ovn-controller-tls-certs\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.754319 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-combined-ca-bundle\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.754784 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq9zr\" (UniqueName: \"kubernetes.io/projected/a7f5afd4-05f3-4954-9dc9-3efa47c22b85-kube-api-access-gq9zr\") pod \"ovn-controller-kf4ss\" (UID: \"a7f5afd4-05f3-4954-9dc9-3efa47c22b85\") " pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.825473 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kf4ss" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.832046 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b675bf96-ecb2-4098-891f-6a87e0ed5140-scripts\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.832112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-log\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.832139 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-etc-ovs\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.832179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-run\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.832201 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-lib\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.832238 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgjxq\" (UniqueName: \"kubernetes.io/projected/b675bf96-ecb2-4098-891f-6a87e0ed5140-kube-api-access-lgjxq\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.834590 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b675bf96-ecb2-4098-891f-6a87e0ed5140-scripts\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.834798 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-log\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.834896 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-etc-ovs\") pod \"ovn-controller-ovs-9cv7c\" (UID: 
\"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.834953 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-run\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.835076 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b675bf96-ecb2-4098-891f-6a87e0ed5140-var-lib\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:44 crc kubenswrapper[4781]: I1208 20:21:44.880066 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgjxq\" (UniqueName: \"kubernetes.io/projected/b675bf96-ecb2-4098-891f-6a87e0ed5140-kube-api-access-lgjxq\") pod \"ovn-controller-ovs-9cv7c\" (UID: \"b675bf96-ecb2-4098-891f-6a87e0ed5140\") " pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.004395 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.197452 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.198675 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.201468 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.202336 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.202576 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.202625 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.202740 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9r9k8" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.212390 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.338098 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.338151 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfr5b\" (UniqueName: \"kubernetes.io/projected/3530fc96-f407-470c-a960-c7cfd844c517-kube-api-access-nfr5b\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.338190 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3530fc96-f407-470c-a960-c7cfd844c517-config\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.338456 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.338505 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.338525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3530fc96-f407-470c-a960-c7cfd844c517-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.338578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3530fc96-f407-470c-a960-c7cfd844c517-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.338594 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.439577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfr5b\" (UniqueName: \"kubernetes.io/projected/3530fc96-f407-470c-a960-c7cfd844c517-kube-api-access-nfr5b\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.439629 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3530fc96-f407-470c-a960-c7cfd844c517-config\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.439662 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.439701 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.439722 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3530fc96-f407-470c-a960-c7cfd844c517-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" 
Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.439767 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3530fc96-f407-470c-a960-c7cfd844c517-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.439782 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.439807 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.440534 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3530fc96-f407-470c-a960-c7cfd844c517-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.440945 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.441776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/3530fc96-f407-470c-a960-c7cfd844c517-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.442812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3530fc96-f407-470c-a960-c7cfd844c517-config\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.444848 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.451449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.459043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3530fc96-f407-470c-a960-c7cfd844c517-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.463137 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc 
kubenswrapper[4781]: I1208 20:21:45.465379 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfr5b\" (UniqueName: \"kubernetes.io/projected/3530fc96-f407-470c-a960-c7cfd844c517-kube-api-access-nfr5b\") pod \"ovsdbserver-nb-0\" (UID: \"3530fc96-f407-470c-a960-c7cfd844c517\") " pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:45 crc kubenswrapper[4781]: I1208 20:21:45.520774 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.392753 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.394299 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.396675 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-mkhch" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.397117 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.397185 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.397601 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.410574 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.490633 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.490731 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.490763 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.490791 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.490960 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds6jn\" (UniqueName: \"kubernetes.io/projected/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-kube-api-access-ds6jn\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.490996 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.491019 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.491041 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594221 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds6jn\" (UniqueName: \"kubernetes.io/projected/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-kube-api-access-ds6jn\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594267 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594285 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594304 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594345 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594377 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594393 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594409 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.594699 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.595304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.595334 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.595785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.599946 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.600615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.603593 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.617068 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds6jn\" (UniqueName: \"kubernetes.io/projected/5544d7c5-67c2-4f2e-9e0f-d8307d831d5d-kube-api-access-ds6jn\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.626424 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-sb-0\" (UID: \"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d\") " pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:47 crc kubenswrapper[4781]: I1208 20:21:47.715076 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 08 20:21:51 crc kubenswrapper[4781]: W1208 20:21:51.738229 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88416662_c07f_4d9f_b9cb_7f92d21aaa6f.slice/crio-99ce81dfebea1b5bf167f0ed15c91553f68095c4f319c62fe7b98a96e05bb8f0 WatchSource:0}: Error finding container 99ce81dfebea1b5bf167f0ed15c91553f68095c4f319c62fe7b98a96e05bb8f0: Status 404 returned error can't find the container with id 99ce81dfebea1b5bf167f0ed15c91553f68095c4f319c62fe7b98a96e05bb8f0 Dec 08 20:21:51 crc kubenswrapper[4781]: I1208 20:21:51.932118 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"88416662-c07f-4d9f-b9cb-7f92d21aaa6f","Type":"ContainerStarted","Data":"99ce81dfebea1b5bf167f0ed15c91553f68095c4f319c62fe7b98a96e05bb8f0"} Dec 08 20:21:57 crc kubenswrapper[4781]: E1208 20:21:57.087187 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d" Dec 08 20:21:57 crc kubenswrapper[4781]: E1208 20:21:57.087852 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 
30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hxnll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerR
esizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(202f9454-de0b-4a09-abb6-dacbea9b5fa4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:21:57 crc kubenswrapper[4781]: E1208 20:21:57.091057 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" Dec 08 20:21:57 crc kubenswrapper[4781]: E1208 20:21:57.102876 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d" Dec 08 20:21:57 crc kubenswrapper[4781]: E1208 20:21:57.103066 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dgpxx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(9192ae66-92ec-4618-aecd-3ec306da8525): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:21:57 crc 
kubenswrapper[4781]: E1208 20:21:57.104586 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" Dec 08 20:21:57 crc kubenswrapper[4781]: I1208 20:21:57.564604 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 08 20:21:57 crc kubenswrapper[4781]: E1208 20:21:57.981364 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d\\\"\"" pod="openstack/rabbitmq-server-0" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" Dec 08 20:21:57 crc kubenswrapper[4781]: E1208 20:21:57.981574 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:c2e607504d3c90c201f29ca74f0ee51de1ef2c5444d597d2ff3160ab9b3ce84d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" Dec 08 20:21:59 crc kubenswrapper[4781]: I1208 20:21:59.947721 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:21:59 crc kubenswrapper[4781]: I1208 20:21:59.948073 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:22:01 crc kubenswrapper[4781]: W1208 20:22:01.736638 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3530fc96_f407_470c_a960_c7cfd844c517.slice/crio-80073c4144d523e16dcec307b6cef8ac7310ef9aa2635158f113c9d2ac6e6e4b WatchSource:0}: Error finding container 80073c4144d523e16dcec307b6cef8ac7310ef9aa2635158f113c9d2ac6e6e4b: Status 404 returned error can't find the container with id 80073c4144d523e16dcec307b6cef8ac7310ef9aa2635158f113c9d2ac6e6e4b Dec 08 20:22:02 crc kubenswrapper[4781]: I1208 20:22:02.006597 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3530fc96-f407-470c-a960-c7cfd844c517","Type":"ContainerStarted","Data":"80073c4144d523e16dcec307b6cef8ac7310ef9aa2635158f113c9d2ac6e6e4b"} Dec 08 20:22:02 crc kubenswrapper[4781]: I1208 20:22:02.374739 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 08 20:22:02 crc kubenswrapper[4781]: I1208 20:22:02.451512 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.747971 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.748217 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d 
--hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x27lh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-567c455747-jb8jk_openstack(8d6cf391-8ec5-42dc-9618-89d928d6a85c): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.752083 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-567c455747-jb8jk" podUID="8d6cf391-8ec5-42dc-9618-89d928d6a85c" Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.757059 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.757225 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z54hz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bc4b48fc9-4qvh7_openstack(3bc05ac8-a1b5-458d-b1df-ed7fa1908349): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.758536 4781 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" podUID="3bc05ac8-a1b5-458d-b1df-ed7fa1908349" Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.775149 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.775354 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z6z2l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5cd484bb89-fv7pt_openstack(934f62c5-5393-4fbc-919b-aa66c1764fde): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:22:02 crc kubenswrapper[4781]: E1208 20:22:02.776656 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" podUID="934f62c5-5393-4fbc-919b-aa66c1764fde" Dec 08 20:22:03 crc kubenswrapper[4781]: E1208 20:22:03.016667 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 08 20:22:03 crc kubenswrapper[4781]: E1208 20:22:03.016836 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtbcv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-cb666b895-vwrcm_openstack(1453615c-ee69-4362-87c3-844a3bb16faf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:22:03 crc kubenswrapper[4781]: E1208 20:22:03.017949 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-cb666b895-vwrcm" podUID="1453615c-ee69-4362-87c3-844a3bb16faf" Dec 08 20:22:03 crc kubenswrapper[4781]: E1208 20:22:03.018649 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" podUID="3bc05ac8-a1b5-458d-b1df-ed7fa1908349" Dec 08 20:22:03 crc kubenswrapper[4781]: I1208 20:22:03.433413 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9cv7c"] Dec 08 20:22:03 crc kubenswrapper[4781]: I1208 20:22:03.464845 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] 
Dec 08 20:22:04 crc kubenswrapper[4781]: E1208 20:22:04.022498 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792\\\"\"" pod="openstack/dnsmasq-dns-cb666b895-vwrcm" podUID="1453615c-ee69-4362-87c3-844a3bb16faf" Dec 08 20:22:05 crc kubenswrapper[4781]: W1208 20:22:05.140195 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ca70357_2cfc_42dd_a9cf_2c5f992ba62d.slice/crio-624f207844f6986613ec2519fde6b56c2758d417b48bb2dc53d007a4d728f4a7 WatchSource:0}: Error finding container 624f207844f6986613ec2519fde6b56c2758d417b48bb2dc53d007a4d728f4a7: Status 404 returned error can't find the container with id 624f207844f6986613ec2519fde6b56c2758d417b48bb2dc53d007a4d728f4a7 Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.258326 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.259907 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.427811 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6z2l\" (UniqueName: \"kubernetes.io/projected/934f62c5-5393-4fbc-919b-aa66c1764fde-kube-api-access-z6z2l\") pod \"934f62c5-5393-4fbc-919b-aa66c1764fde\" (UID: \"934f62c5-5393-4fbc-919b-aa66c1764fde\") " Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.428008 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-dns-svc\") pod \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.428249 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x27lh\" (UniqueName: \"kubernetes.io/projected/8d6cf391-8ec5-42dc-9618-89d928d6a85c-kube-api-access-x27lh\") pod \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.428548 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d6cf391-8ec5-42dc-9618-89d928d6a85c" (UID: "8d6cf391-8ec5-42dc-9618-89d928d6a85c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.428707 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f62c5-5393-4fbc-919b-aa66c1764fde-config\") pod \"934f62c5-5393-4fbc-919b-aa66c1764fde\" (UID: \"934f62c5-5393-4fbc-919b-aa66c1764fde\") " Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.428740 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-config\") pod \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\" (UID: \"8d6cf391-8ec5-42dc-9618-89d928d6a85c\") " Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.429159 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-config" (OuterVolumeSpecName: "config") pod "8d6cf391-8ec5-42dc-9618-89d928d6a85c" (UID: "8d6cf391-8ec5-42dc-9618-89d928d6a85c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.429199 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/934f62c5-5393-4fbc-919b-aa66c1764fde-config" (OuterVolumeSpecName: "config") pod "934f62c5-5393-4fbc-919b-aa66c1764fde" (UID: "934f62c5-5393-4fbc-919b-aa66c1764fde"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.429873 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/934f62c5-5393-4fbc-919b-aa66c1764fde-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.429893 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.429903 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d6cf391-8ec5-42dc-9618-89d928d6a85c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.432643 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/934f62c5-5393-4fbc-919b-aa66c1764fde-kube-api-access-z6z2l" (OuterVolumeSpecName: "kube-api-access-z6z2l") pod "934f62c5-5393-4fbc-919b-aa66c1764fde" (UID: "934f62c5-5393-4fbc-919b-aa66c1764fde"). InnerVolumeSpecName "kube-api-access-z6z2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.432706 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6cf391-8ec5-42dc-9618-89d928d6a85c-kube-api-access-x27lh" (OuterVolumeSpecName: "kube-api-access-x27lh") pod "8d6cf391-8ec5-42dc-9618-89d928d6a85c" (UID: "8d6cf391-8ec5-42dc-9618-89d928d6a85c"). InnerVolumeSpecName "kube-api-access-x27lh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.531665 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6z2l\" (UniqueName: \"kubernetes.io/projected/934f62c5-5393-4fbc-919b-aa66c1764fde-kube-api-access-z6z2l\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.532664 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x27lh\" (UniqueName: \"kubernetes.io/projected/8d6cf391-8ec5-42dc-9618-89d928d6a85c-kube-api-access-x27lh\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.639454 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kf4ss"] Dec 08 20:22:05 crc kubenswrapper[4781]: I1208 20:22:05.724117 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 08 20:22:05 crc kubenswrapper[4781]: W1208 20:22:05.729813 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5544d7c5_67c2_4f2e_9e0f_d8307d831d5d.slice/crio-6ea94f1210a0f645b33ee7b3bef8d1d06aa6293413c7549460038453cdec5e16 WatchSource:0}: Error finding container 6ea94f1210a0f645b33ee7b3bef8d1d06aa6293413c7549460038453cdec5e16: Status 404 returned error can't find the container with id 6ea94f1210a0f645b33ee7b3bef8d1d06aa6293413c7549460038453cdec5e16 Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.061347 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3be01047-1cc6-4ed4-9d41-68b1f67f7a11","Type":"ContainerStarted","Data":"f7f35a1d72fd8de781d6dcb1666e545ff102f56a6ed0d59570d3832f1c1f4134"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.063612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9cv7c" 
event={"ID":"b675bf96-ecb2-4098-891f-6a87e0ed5140","Type":"ContainerStarted","Data":"f9fa4bf80b27a6cb3901c1c0ce72db3a63cb683664863c60ba516efc2ea242a9"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.065282 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" event={"ID":"934f62c5-5393-4fbc-919b-aa66c1764fde","Type":"ContainerDied","Data":"124d9ae748add665e2e095b48f9356055f530463ffab226c9423fb8a2ac9f263"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.065440 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-fv7pt" Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.066284 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d","Type":"ContainerStarted","Data":"6ea94f1210a0f645b33ee7b3bef8d1d06aa6293413c7549460038453cdec5e16"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.079436 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"88416662-c07f-4d9f-b9cb-7f92d21aaa6f","Type":"ContainerStarted","Data":"131e852e3cb5f9aca9b0f0ec2670648e8b6132aca6b589f3d4101214c760bd90"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.084069 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-jb8jk" event={"ID":"8d6cf391-8ec5-42dc-9618-89d928d6a85c","Type":"ContainerDied","Data":"c9ab1cc443b50a1ea68d1e8e941585a337de71f655a753d17a9e92e983a97154"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.084130 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-jb8jk" Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.094874 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"95c2c3c0-0733-4bac-bf28-0805d8c9a499","Type":"ContainerStarted","Data":"ba0b470d8b625fe0bbb6c33b8fa701f35487ee7beafc974a2d189addd6398533"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.094950 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"95c2c3c0-0733-4bac-bf28-0805d8c9a499","Type":"ContainerStarted","Data":"5cee414b09d2aa919a5d16ae8bf8eea2648ce5982e137ec28774ddef26a57190"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.105709 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d","Type":"ContainerStarted","Data":"624f207844f6986613ec2519fde6b56c2758d417b48bb2dc53d007a4d728f4a7"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.107615 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kf4ss" event={"ID":"a7f5afd4-05f3-4954-9dc9-3efa47c22b85","Type":"ContainerStarted","Data":"1f6678e32e7b6d054458c1533f62edfeca0bb184e54baa89cc3f7a25e27110e1"} Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.268703 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-fv7pt"] Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.276472 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-fv7pt"] Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.596677 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-jb8jk"] Dec 08 20:22:06 crc kubenswrapper[4781]: I1208 20:22:06.602282 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567c455747-jb8jk"] Dec 08 20:22:07 crc 
kubenswrapper[4781]: I1208 20:22:07.836759 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9w8ql"] Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.838086 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.840296 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.846163 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9w8ql"] Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.948367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d8cf2d7-e85f-49e8-95e2-c1548f506888-combined-ca-bundle\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.948689 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0d8cf2d7-e85f-49e8-95e2-c1548f506888-ovs-rundir\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.948750 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvnh\" (UniqueName: \"kubernetes.io/projected/0d8cf2d7-e85f-49e8-95e2-c1548f506888-kube-api-access-ztvnh\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.948816 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8cf2d7-e85f-49e8-95e2-c1548f506888-config\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.948842 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d8cf2d7-e85f-49e8-95e2-c1548f506888-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.948865 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0d8cf2d7-e85f-49e8-95e2-c1548f506888-ovn-rundir\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:07 crc kubenswrapper[4781]: I1208 20:22:07.991418 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-4qvh7"] Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.036023 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-chtjj"] Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.037565 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.040369 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.044800 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-chtjj"] Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.049939 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0d8cf2d7-e85f-49e8-95e2-c1548f506888-ovn-rundir\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.050014 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d8cf2d7-e85f-49e8-95e2-c1548f506888-combined-ca-bundle\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.050054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0d8cf2d7-e85f-49e8-95e2-c1548f506888-ovs-rundir\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.050123 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztvnh\" (UniqueName: \"kubernetes.io/projected/0d8cf2d7-e85f-49e8-95e2-c1548f506888-kube-api-access-ztvnh\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: 
I1208 20:22:08.050204 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8cf2d7-e85f-49e8-95e2-c1548f506888-config\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.050241 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d8cf2d7-e85f-49e8-95e2-c1548f506888-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.050480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0d8cf2d7-e85f-49e8-95e2-c1548f506888-ovn-rundir\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.050485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0d8cf2d7-e85f-49e8-95e2-c1548f506888-ovs-rundir\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.051162 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d8cf2d7-e85f-49e8-95e2-c1548f506888-config\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.059583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d8cf2d7-e85f-49e8-95e2-c1548f506888-combined-ca-bundle\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.060314 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d8cf2d7-e85f-49e8-95e2-c1548f506888-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.087373 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztvnh\" (UniqueName: \"kubernetes.io/projected/0d8cf2d7-e85f-49e8-95e2-c1548f506888-kube-api-access-ztvnh\") pod \"ovn-controller-metrics-9w8ql\" (UID: \"0d8cf2d7-e85f-49e8-95e2-c1548f506888\") " pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.140228 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6cf391-8ec5-42dc-9618-89d928d6a85c" path="/var/lib/kubelet/pods/8d6cf391-8ec5-42dc-9618-89d928d6a85c/volumes" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.140784 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="934f62c5-5393-4fbc-919b-aa66c1764fde" path="/var/lib/kubelet/pods/934f62c5-5393-4fbc-919b-aa66c1764fde/volumes" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.151213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-config\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.151334 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bbt2\" (UniqueName: \"kubernetes.io/projected/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-kube-api-access-9bbt2\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.151386 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.151463 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.172016 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-9w8ql" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.252743 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-config\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.252904 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bbt2\" (UniqueName: \"kubernetes.io/projected/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-kube-api-access-9bbt2\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.253227 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.253326 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.256336 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 
20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.256364 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.257399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-config\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.284497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bbt2\" (UniqueName: \"kubernetes.io/projected/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-kube-api-access-9bbt2\") pod \"dnsmasq-dns-57db9b5bc9-chtjj\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.313057 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-vwrcm"] Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.421369 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-h7t2l"] Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.429944 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.430866 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-h7t2l"] Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.443219 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.453331 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.562788 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.562837 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-config\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.562863 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-dns-svc\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.563028 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-nb\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.563102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kcnl\" (UniqueName: \"kubernetes.io/projected/739c8ff6-4b97-4e48-8418-a06815318dc5-kube-api-access-8kcnl\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.664398 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.664457 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-config\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.664486 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-dns-svc\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.664537 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-nb\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.664565 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kcnl\" (UniqueName: \"kubernetes.io/projected/739c8ff6-4b97-4e48-8418-a06815318dc5-kube-api-access-8kcnl\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.665592 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.666204 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-config\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.666848 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-dns-svc\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.667336 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-nb\") pod 
\"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.683060 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kcnl\" (UniqueName: \"kubernetes.io/projected/739c8ff6-4b97-4e48-8418-a06815318dc5-kube-api-access-8kcnl\") pod \"dnsmasq-dns-db7757ddc-h7t2l\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.767615 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.798069 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.809064 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.867672 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z54hz\" (UniqueName: \"kubernetes.io/projected/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-kube-api-access-z54hz\") pod \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868010 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-config\") pod \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-config\") pod \"1453615c-ee69-4362-87c3-844a3bb16faf\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868083 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-dns-svc\") pod \"1453615c-ee69-4362-87c3-844a3bb16faf\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868183 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtbcv\" (UniqueName: \"kubernetes.io/projected/1453615c-ee69-4362-87c3-844a3bb16faf-kube-api-access-xtbcv\") pod \"1453615c-ee69-4362-87c3-844a3bb16faf\" (UID: \"1453615c-ee69-4362-87c3-844a3bb16faf\") " Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868242 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-dns-svc\") pod \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\" (UID: \"3bc05ac8-a1b5-458d-b1df-ed7fa1908349\") " Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868360 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-config" (OuterVolumeSpecName: "config") pod "3bc05ac8-a1b5-458d-b1df-ed7fa1908349" (UID: "3bc05ac8-a1b5-458d-b1df-ed7fa1908349"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868702 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bc05ac8-a1b5-458d-b1df-ed7fa1908349" (UID: "3bc05ac8-a1b5-458d-b1df-ed7fa1908349"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1453615c-ee69-4362-87c3-844a3bb16faf" (UID: "1453615c-ee69-4362-87c3-844a3bb16faf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.868795 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.869023 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-config" (OuterVolumeSpecName: "config") pod "1453615c-ee69-4362-87c3-844a3bb16faf" (UID: "1453615c-ee69-4362-87c3-844a3bb16faf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.872857 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1453615c-ee69-4362-87c3-844a3bb16faf-kube-api-access-xtbcv" (OuterVolumeSpecName: "kube-api-access-xtbcv") pod "1453615c-ee69-4362-87c3-844a3bb16faf" (UID: "1453615c-ee69-4362-87c3-844a3bb16faf"). InnerVolumeSpecName "kube-api-access-xtbcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.873357 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-kube-api-access-z54hz" (OuterVolumeSpecName: "kube-api-access-z54hz") pod "3bc05ac8-a1b5-458d-b1df-ed7fa1908349" (UID: "3bc05ac8-a1b5-458d-b1df-ed7fa1908349"). InnerVolumeSpecName "kube-api-access-z54hz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.970475 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtbcv\" (UniqueName: \"kubernetes.io/projected/1453615c-ee69-4362-87c3-844a3bb16faf-kube-api-access-xtbcv\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.970512 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.970528 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z54hz\" (UniqueName: \"kubernetes.io/projected/3bc05ac8-a1b5-458d-b1df-ed7fa1908349-kube-api-access-z54hz\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.970540 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:08 crc kubenswrapper[4781]: I1208 20:22:08.970552 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1453615c-ee69-4362-87c3-844a3bb16faf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:09 crc kubenswrapper[4781]: I1208 20:22:09.146939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" event={"ID":"3bc05ac8-a1b5-458d-b1df-ed7fa1908349","Type":"ContainerDied","Data":"c46ae4c6c95bcdec2afd98288820a7e774efe0ee8ca80b860a3642c6911a7e0b"} Dec 08 20:22:09 crc kubenswrapper[4781]: I1208 20:22:09.147050 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-4qvh7" Dec 08 20:22:09 crc kubenswrapper[4781]: I1208 20:22:09.158214 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-vwrcm" event={"ID":"1453615c-ee69-4362-87c3-844a3bb16faf","Type":"ContainerDied","Data":"892b7e90809a24e7d096e7c836f62c95915c1d7e5def08074a1929def2768889"} Dec 08 20:22:09 crc kubenswrapper[4781]: I1208 20:22:09.158281 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-vwrcm" Dec 08 20:22:09 crc kubenswrapper[4781]: I1208 20:22:09.214611 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-vwrcm"] Dec 08 20:22:09 crc kubenswrapper[4781]: I1208 20:22:09.222533 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-vwrcm"] Dec 08 20:22:09 crc kubenswrapper[4781]: I1208 20:22:09.255981 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-4qvh7"] Dec 08 20:22:09 crc kubenswrapper[4781]: I1208 20:22:09.263215 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-4qvh7"] Dec 08 20:22:10 crc kubenswrapper[4781]: I1208 20:22:10.135832 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1453615c-ee69-4362-87c3-844a3bb16faf" path="/var/lib/kubelet/pods/1453615c-ee69-4362-87c3-844a3bb16faf/volumes" Dec 08 20:22:10 crc kubenswrapper[4781]: I1208 20:22:10.136657 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc05ac8-a1b5-458d-b1df-ed7fa1908349" path="/var/lib/kubelet/pods/3bc05ac8-a1b5-458d-b1df-ed7fa1908349/volumes" Dec 08 20:22:10 crc kubenswrapper[4781]: I1208 20:22:10.168378 4781 generic.go:334] "Generic (PLEG): container finished" podID="88416662-c07f-4d9f-b9cb-7f92d21aaa6f" containerID="131e852e3cb5f9aca9b0f0ec2670648e8b6132aca6b589f3d4101214c760bd90" exitCode=0 Dec 08 20:22:10 crc 
kubenswrapper[4781]: I1208 20:22:10.168421 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"88416662-c07f-4d9f-b9cb-7f92d21aaa6f","Type":"ContainerDied","Data":"131e852e3cb5f9aca9b0f0ec2670648e8b6132aca6b589f3d4101214c760bd90"} Dec 08 20:22:11 crc kubenswrapper[4781]: I1208 20:22:11.189618 4781 generic.go:334] "Generic (PLEG): container finished" podID="95c2c3c0-0733-4bac-bf28-0805d8c9a499" containerID="ba0b470d8b625fe0bbb6c33b8fa701f35487ee7beafc974a2d189addd6398533" exitCode=0 Dec 08 20:22:11 crc kubenswrapper[4781]: I1208 20:22:11.189668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"95c2c3c0-0733-4bac-bf28-0805d8c9a499","Type":"ContainerDied","Data":"ba0b470d8b625fe0bbb6c33b8fa701f35487ee7beafc974a2d189addd6398533"} Dec 08 20:22:12 crc kubenswrapper[4781]: I1208 20:22:12.657700 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-h7t2l"] Dec 08 20:22:12 crc kubenswrapper[4781]: I1208 20:22:12.836073 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9w8ql"] Dec 08 20:22:12 crc kubenswrapper[4781]: I1208 20:22:12.845107 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-chtjj"] Dec 08 20:22:13 crc kubenswrapper[4781]: W1208 20:22:13.140354 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d8cf2d7_e85f_49e8_95e2_c1548f506888.slice/crio-f4c296ef65ff005ba43e9c1ddd846911e6f79d0daa8d5fe435244164cae1ae5b WatchSource:0}: Error finding container f4c296ef65ff005ba43e9c1ddd846911e6f79d0daa8d5fe435244164cae1ae5b: Status 404 returned error can't find the container with id f4c296ef65ff005ba43e9c1ddd846911e6f79d0daa8d5fe435244164cae1ae5b Dec 08 20:22:13 crc kubenswrapper[4781]: W1208 20:22:13.157379 4781 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc65c4faf_ee79_4c01_98fc_d7ad607e6da4.slice/crio-5c0639cbc73e8225f3d81e37d249728e8aeaa348a609577cc5f0d8e5e07b1605 WatchSource:0}: Error finding container 5c0639cbc73e8225f3d81e37d249728e8aeaa348a609577cc5f0d8e5e07b1605: Status 404 returned error can't find the container with id 5c0639cbc73e8225f3d81e37d249728e8aeaa348a609577cc5f0d8e5e07b1605 Dec 08 20:22:13 crc kubenswrapper[4781]: I1208 20:22:13.207334 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9w8ql" event={"ID":"0d8cf2d7-e85f-49e8-95e2-c1548f506888","Type":"ContainerStarted","Data":"f4c296ef65ff005ba43e9c1ddd846911e6f79d0daa8d5fe435244164cae1ae5b"} Dec 08 20:22:13 crc kubenswrapper[4781]: I1208 20:22:13.208621 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" event={"ID":"739c8ff6-4b97-4e48-8418-a06815318dc5","Type":"ContainerStarted","Data":"bcaa7325c1f3624012c89f97d52aa5a25277f0d46e931f3d61877f13a656eb7d"} Dec 08 20:22:13 crc kubenswrapper[4781]: I1208 20:22:13.210643 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"88416662-c07f-4d9f-b9cb-7f92d21aaa6f","Type":"ContainerStarted","Data":"c7bdb44d6a43d3b716f216e740674de06e29aab0f0620f4cb80acbf33fc5001e"} Dec 08 20:22:13 crc kubenswrapper[4781]: I1208 20:22:13.211808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" event={"ID":"c65c4faf-ee79-4c01-98fc-d7ad607e6da4","Type":"ContainerStarted","Data":"5c0639cbc73e8225f3d81e37d249728e8aeaa348a609577cc5f0d8e5e07b1605"} Dec 08 20:22:13 crc kubenswrapper[4781]: I1208 20:22:13.236950 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.792896759 podStartE2EDuration="37.236911338s" podCreationTimestamp="2025-12-08 20:21:36 +0000 UTC" firstStartedPulling="2025-12-08 
20:21:51.749110666 +0000 UTC m=+1027.900394043" lastFinishedPulling="2025-12-08 20:22:05.193125245 +0000 UTC m=+1041.344408622" observedRunningTime="2025-12-08 20:22:13.230096583 +0000 UTC m=+1049.381380070" watchObservedRunningTime="2025-12-08 20:22:13.236911338 +0000 UTC m=+1049.388194715" Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.227400 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d","Type":"ContainerStarted","Data":"66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98"} Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.229020 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kf4ss" event={"ID":"a7f5afd4-05f3-4954-9dc9-3efa47c22b85","Type":"ContainerStarted","Data":"aaef7eb0c279c8f8d0fd67a092608129d47461aec9043ca00c52f1aa8ff7c143"} Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.231396 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3530fc96-f407-470c-a960-c7cfd844c517","Type":"ContainerStarted","Data":"6cd9c26766e01596b215ee8e7bdcc0e4d9ad0d8a5c6cfc6ae2ebb4ba86e7c921"} Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.232841 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" event={"ID":"739c8ff6-4b97-4e48-8418-a06815318dc5","Type":"ContainerStarted","Data":"08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68"} Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.235666 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3be01047-1cc6-4ed4-9d41-68b1f67f7a11","Type":"ContainerStarted","Data":"f957095482d6a672b33b34757670eed3a9d8a91c4f9ec0aead952daffeb4b4bf"} Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.235799 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 08 20:22:14 
crc kubenswrapper[4781]: I1208 20:22:14.236949 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9cv7c" event={"ID":"b675bf96-ecb2-4098-891f-6a87e0ed5140","Type":"ContainerStarted","Data":"f5c0c640a4f44ca8fc6b97d34e74adc53a597a4a98dd6bf916bc0bd40651d921"} Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.239224 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"95c2c3c0-0733-4bac-bf28-0805d8c9a499","Type":"ContainerStarted","Data":"121cb93ca752b99105722d3d8535e911559a6101ccf2a8768d36019ec5e69c10"} Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.240479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d","Type":"ContainerStarted","Data":"369f39acf6494a8aae3f0331c9b7ca6792a035fe17e3489b96ec07dd3e26b00b"} Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.291119 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=36.89711608 podStartE2EDuration="37.291105685s" podCreationTimestamp="2025-12-08 20:21:37 +0000 UTC" firstStartedPulling="2025-12-08 20:22:05.093075162 +0000 UTC m=+1041.244358539" lastFinishedPulling="2025-12-08 20:22:05.487064767 +0000 UTC m=+1041.638348144" observedRunningTime="2025-12-08 20:22:14.290819076 +0000 UTC m=+1050.442102453" watchObservedRunningTime="2025-12-08 20:22:14.291105685 +0000 UTC m=+1050.442389062" Dec 08 20:22:14 crc kubenswrapper[4781]: I1208 20:22:14.317584 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=28.447004615 podStartE2EDuration="35.317549634s" podCreationTimestamp="2025-12-08 20:21:39 +0000 UTC" firstStartedPulling="2025-12-08 20:22:05.063814831 +0000 UTC m=+1041.215098218" lastFinishedPulling="2025-12-08 20:22:11.93435986 +0000 UTC m=+1048.085643237" 
observedRunningTime="2025-12-08 20:22:14.310128321 +0000 UTC m=+1050.461411698" watchObservedRunningTime="2025-12-08 20:22:14.317549634 +0000 UTC m=+1050.468833011" Dec 08 20:22:15 crc kubenswrapper[4781]: I1208 20:22:15.255976 4781 generic.go:334] "Generic (PLEG): container finished" podID="739c8ff6-4b97-4e48-8418-a06815318dc5" containerID="08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68" exitCode=0 Dec 08 20:22:15 crc kubenswrapper[4781]: I1208 20:22:15.256215 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" event={"ID":"739c8ff6-4b97-4e48-8418-a06815318dc5","Type":"ContainerDied","Data":"08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68"} Dec 08 20:22:15 crc kubenswrapper[4781]: I1208 20:22:15.259883 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-kf4ss" Dec 08 20:22:15 crc kubenswrapper[4781]: I1208 20:22:15.303474 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kf4ss" podStartSLOduration=23.83542556 podStartE2EDuration="31.303456889s" podCreationTimestamp="2025-12-08 20:21:44 +0000 UTC" firstStartedPulling="2025-12-08 20:22:05.667702244 +0000 UTC m=+1041.818985621" lastFinishedPulling="2025-12-08 20:22:13.135733573 +0000 UTC m=+1049.287016950" observedRunningTime="2025-12-08 20:22:15.296941632 +0000 UTC m=+1051.448225009" watchObservedRunningTime="2025-12-08 20:22:15.303456889 +0000 UTC m=+1051.454740256" Dec 08 20:22:17 crc kubenswrapper[4781]: I1208 20:22:17.293155 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=29.201343639 podStartE2EDuration="37.293136341s" podCreationTimestamp="2025-12-08 20:21:40 +0000 UTC" firstStartedPulling="2025-12-08 20:22:05.163991558 +0000 UTC m=+1041.315274925" lastFinishedPulling="2025-12-08 20:22:13.25578424 +0000 UTC m=+1049.407067627" 
observedRunningTime="2025-12-08 20:22:17.289990741 +0000 UTC m=+1053.441274138" watchObservedRunningTime="2025-12-08 20:22:17.293136341 +0000 UTC m=+1053.444419718" Dec 08 20:22:17 crc kubenswrapper[4781]: I1208 20:22:17.967386 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 08 20:22:17 crc kubenswrapper[4781]: I1208 20:22:17.967661 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 08 20:22:19 crc kubenswrapper[4781]: I1208 20:22:19.184298 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 08 20:22:19 crc kubenswrapper[4781]: I1208 20:22:19.184383 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 08 20:22:19 crc kubenswrapper[4781]: I1208 20:22:19.710875 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.298260 4781 generic.go:334] "Generic (PLEG): container finished" podID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" containerID="4a162a04dbe9dd0496f492379fe26a584344d9c1d070046fed76595e93c4543a" exitCode=0 Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.298577 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" event={"ID":"c65c4faf-ee79-4c01-98fc-d7ad607e6da4","Type":"ContainerDied","Data":"4a162a04dbe9dd0496f492379fe26a584344d9c1d070046fed76595e93c4543a"} Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.300250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9192ae66-92ec-4618-aecd-3ec306da8525","Type":"ContainerStarted","Data":"53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558"} Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.302125 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"202f9454-de0b-4a09-abb6-dacbea9b5fa4","Type":"ContainerStarted","Data":"3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9"} Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.304165 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" event={"ID":"739c8ff6-4b97-4e48-8418-a06815318dc5","Type":"ContainerStarted","Data":"2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc"} Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.304676 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.307152 4781 generic.go:334] "Generic (PLEG): container finished" podID="b675bf96-ecb2-4098-891f-6a87e0ed5140" containerID="f5c0c640a4f44ca8fc6b97d34e74adc53a597a4a98dd6bf916bc0bd40651d921" exitCode=0 Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.307199 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9cv7c" event={"ID":"b675bf96-ecb2-4098-891f-6a87e0ed5140","Type":"ContainerDied","Data":"f5c0c640a4f44ca8fc6b97d34e74adc53a597a4a98dd6bf916bc0bd40651d921"} Dec 08 20:22:20 crc kubenswrapper[4781]: I1208 20:22:20.420487 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" podStartSLOduration=11.858652913 podStartE2EDuration="12.420469217s" podCreationTimestamp="2025-12-08 20:22:08 +0000 UTC" firstStartedPulling="2025-12-08 20:22:12.696715235 +0000 UTC m=+1048.847998612" lastFinishedPulling="2025-12-08 20:22:13.258531539 +0000 UTC m=+1049.409814916" observedRunningTime="2025-12-08 20:22:20.418077968 +0000 UTC m=+1056.569361345" watchObservedRunningTime="2025-12-08 20:22:20.420469217 +0000 UTC m=+1056.571752594" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.269457 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.282013 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.323721 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-chtjj"] Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.368375 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-pfdb5"] Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.375458 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.384700 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9cv7c" event={"ID":"b675bf96-ecb2-4098-891f-6a87e0ed5140","Type":"ContainerStarted","Data":"164ea9bebed590f70f47355f078f068e3ca518614beb32b6c6dadf58f127fa68"} Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.384758 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9cv7c" event={"ID":"b675bf96-ecb2-4098-891f-6a87e0ed5140","Type":"ContainerStarted","Data":"fec41c065fb3118d3c70bdc5c51763c6b88153b5a369d79df929ec4b63e692a6"} Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.385083 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.385118 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.385782 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-pfdb5"] Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.404436 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" event={"ID":"c65c4faf-ee79-4c01-98fc-d7ad607e6da4","Type":"ContainerStarted","Data":"0f19e50fe3efd2aea838d52251cbe2b173e49e19f53cf7a037ec5cecc72bbf4b"} Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.404510 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.470227 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9cv7c" podStartSLOduration=30.401397992 podStartE2EDuration="37.470207355s" podCreationTimestamp="2025-12-08 20:21:44 +0000 UTC" firstStartedPulling="2025-12-08 20:22:05.164276067 +0000 UTC m=+1041.315559444" lastFinishedPulling="2025-12-08 20:22:12.23308543 +0000 UTC m=+1048.384368807" observedRunningTime="2025-12-08 20:22:21.459430385 +0000 UTC m=+1057.610713762" watchObservedRunningTime="2025-12-08 20:22:21.470207355 +0000 UTC m=+1057.621490732" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.486546 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" podStartSLOduration=8.262486431 podStartE2EDuration="14.486519803s" podCreationTimestamp="2025-12-08 20:22:07 +0000 UTC" firstStartedPulling="2025-12-08 20:22:13.214719931 +0000 UTC m=+1049.366003308" lastFinishedPulling="2025-12-08 20:22:19.438753303 +0000 UTC m=+1055.590036680" observedRunningTime="2025-12-08 20:22:21.485999128 +0000 UTC m=+1057.637282505" watchObservedRunningTime="2025-12-08 20:22:21.486519803 +0000 UTC m=+1057.637803180" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.487633 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg7j8\" (UniqueName: \"kubernetes.io/projected/269342fb-c8ab-4f10-8b53-e49c89d73afa-kube-api-access-bg7j8\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") 
" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.487690 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.487764 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.487999 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.488028 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-config\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.596157 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " 
pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.596223 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-config\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.596296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg7j8\" (UniqueName: \"kubernetes.io/projected/269342fb-c8ab-4f10-8b53-e49c89d73afa-kube-api-access-bg7j8\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.596321 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.596383 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.597423 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc 
kubenswrapper[4781]: I1208 20:22:21.599684 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-config\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.600354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.600654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.649894 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg7j8\" (UniqueName: \"kubernetes.io/projected/269342fb-c8ab-4f10-8b53-e49c89d73afa-kube-api-access-bg7j8\") pod \"dnsmasq-dns-59d5fbdd8c-pfdb5\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:21 crc kubenswrapper[4781]: I1208 20:22:21.707346 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.221000 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.302616 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.461029 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" podUID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" containerName="dnsmasq-dns" containerID="cri-o://0f19e50fe3efd2aea838d52251cbe2b173e49e19f53cf7a037ec5cecc72bbf4b" gracePeriod=10 Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.556067 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.570235 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.574375 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.574644 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.575065 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-47cpj" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.576492 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.606661 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.715003 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-cdbxl"] Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.717323 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.721226 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.721252 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.721361 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.726724 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cdbxl"] Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.755515 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.755617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/75ab4f11-7508-4813-83bd-05ef029af585-lock\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.755641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f6vt\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-kube-api-access-4f6vt\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.755667 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/75ab4f11-7508-4813-83bd-05ef029af585-cache\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.755706 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857406 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f6vt\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-kube-api-access-4f6vt\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857452 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7hs\" (UniqueName: \"kubernetes.io/projected/ec594735-a472-4c13-b98b-453a80fceb1d-kube-api-access-8b7hs\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857483 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec594735-a472-4c13-b98b-453a80fceb1d-etc-swift\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857510 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/75ab4f11-7508-4813-83bd-05ef029af585-cache\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857542 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-scripts\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857564 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-combined-ca-bundle\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857580 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857614 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-ring-data-devices\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857637 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-swiftconf\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857711 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-dispersionconf\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.857734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/75ab4f11-7508-4813-83bd-05ef029af585-lock\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.858135 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/75ab4f11-7508-4813-83bd-05ef029af585-lock\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.858579 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/75ab4f11-7508-4813-83bd-05ef029af585-cache\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: 
E1208 20:22:22.858680 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 20:22:22 crc kubenswrapper[4781]: E1208 20:22:22.858696 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 20:22:22 crc kubenswrapper[4781]: E1208 20:22:22.858729 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift podName:75ab4f11-7508-4813-83bd-05ef029af585 nodeName:}" failed. No retries permitted until 2025-12-08 20:22:23.358715203 +0000 UTC m=+1059.509998580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift") pod "swift-storage-0" (UID: "75ab4f11-7508-4813-83bd-05ef029af585") : configmap "swift-ring-files" not found Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.859062 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.936444 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f6vt\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-kube-api-access-4f6vt\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.938367 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: 
\"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.959636 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7hs\" (UniqueName: \"kubernetes.io/projected/ec594735-a472-4c13-b98b-453a80fceb1d-kube-api-access-8b7hs\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.959934 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec594735-a472-4c13-b98b-453a80fceb1d-etc-swift\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.960096 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-scripts\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.960228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-combined-ca-bundle\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.960396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-ring-data-devices\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " 
pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.960532 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-swiftconf\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.960730 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-dispersionconf\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.961118 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-ring-data-devices\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.961138 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-scripts\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.961469 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec594735-a472-4c13-b98b-453a80fceb1d-etc-swift\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.965505 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-dispersionconf\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.965586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-swiftconf\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.965787 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-combined-ca-bundle\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:22 crc kubenswrapper[4781]: I1208 20:22:22.978528 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7hs\" (UniqueName: \"kubernetes.io/projected/ec594735-a472-4c13-b98b-453a80fceb1d-kube-api-access-8b7hs\") pod \"swift-ring-rebalance-cdbxl\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:23 crc kubenswrapper[4781]: I1208 20:22:23.099716 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:23 crc kubenswrapper[4781]: I1208 20:22:23.367667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:23 crc kubenswrapper[4781]: E1208 20:22:23.367908 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 20:22:23 crc kubenswrapper[4781]: E1208 20:22:23.367958 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 20:22:23 crc kubenswrapper[4781]: E1208 20:22:23.368023 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift podName:75ab4f11-7508-4813-83bd-05ef029af585 nodeName:}" failed. No retries permitted until 2025-12-08 20:22:24.368006449 +0000 UTC m=+1060.519289816 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift") pod "swift-storage-0" (UID: "75ab4f11-7508-4813-83bd-05ef029af585") : configmap "swift-ring-files" not found Dec 08 20:22:23 crc kubenswrapper[4781]: I1208 20:22:23.468322 4781 generic.go:334] "Generic (PLEG): container finished" podID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" containerID="0f19e50fe3efd2aea838d52251cbe2b173e49e19f53cf7a037ec5cecc72bbf4b" exitCode=0 Dec 08 20:22:23 crc kubenswrapper[4781]: I1208 20:22:23.468364 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" event={"ID":"c65c4faf-ee79-4c01-98fc-d7ad607e6da4","Type":"ContainerDied","Data":"0f19e50fe3efd2aea838d52251cbe2b173e49e19f53cf7a037ec5cecc72bbf4b"} Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.088806 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.187834 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-config\") pod \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.187996 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-ovsdbserver-nb\") pod \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.188034 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-dns-svc\") pod \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\" (UID: 
\"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.188064 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bbt2\" (UniqueName: \"kubernetes.io/projected/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-kube-api-access-9bbt2\") pod \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\" (UID: \"c65c4faf-ee79-4c01-98fc-d7ad607e6da4\") " Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.195144 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-kube-api-access-9bbt2" (OuterVolumeSpecName: "kube-api-access-9bbt2") pod "c65c4faf-ee79-4c01-98fc-d7ad607e6da4" (UID: "c65c4faf-ee79-4c01-98fc-d7ad607e6da4"). InnerVolumeSpecName "kube-api-access-9bbt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.228704 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c65c4faf-ee79-4c01-98fc-d7ad607e6da4" (UID: "c65c4faf-ee79-4c01-98fc-d7ad607e6da4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.230642 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-config" (OuterVolumeSpecName: "config") pod "c65c4faf-ee79-4c01-98fc-d7ad607e6da4" (UID: "c65c4faf-ee79-4c01-98fc-d7ad607e6da4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.245567 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c65c4faf-ee79-4c01-98fc-d7ad607e6da4" (UID: "c65c4faf-ee79-4c01-98fc-d7ad607e6da4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.292193 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.292226 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.292240 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.292251 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bbt2\" (UniqueName: \"kubernetes.io/projected/c65c4faf-ee79-4c01-98fc-d7ad607e6da4-kube-api-access-9bbt2\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.393754 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:24 crc kubenswrapper[4781]: E1208 20:22:24.393968 4781 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 20:22:24 crc kubenswrapper[4781]: E1208 20:22:24.393995 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 20:22:24 crc kubenswrapper[4781]: E1208 20:22:24.394056 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift podName:75ab4f11-7508-4813-83bd-05ef029af585 nodeName:}" failed. No retries permitted until 2025-12-08 20:22:26.394037836 +0000 UTC m=+1062.545321213 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift") pod "swift-storage-0" (UID: "75ab4f11-7508-4813-83bd-05ef029af585") : configmap "swift-ring-files" not found Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.476636 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" event={"ID":"c65c4faf-ee79-4c01-98fc-d7ad607e6da4","Type":"ContainerDied","Data":"5c0639cbc73e8225f3d81e37d249728e8aeaa348a609577cc5f0d8e5e07b1605"} Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.476684 4781 scope.go:117] "RemoveContainer" containerID="0f19e50fe3efd2aea838d52251cbe2b173e49e19f53cf7a037ec5cecc72bbf4b" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.476800 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-chtjj" Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.545783 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-chtjj"] Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.546670 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-chtjj"] Dec 08 20:22:24 crc kubenswrapper[4781]: I1208 20:22:24.556996 4781 scope.go:117] "RemoveContainer" containerID="4a162a04dbe9dd0496f492379fe26a584344d9c1d070046fed76595e93c4543a" Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.069545 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-cdbxl"] Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.172536 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-pfdb5"] Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.316671 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.390204 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.489339 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cdbxl" event={"ID":"ec594735-a472-4c13-b98b-453a80fceb1d","Type":"ContainerStarted","Data":"a96aaf6be3a269be48e2693de169a6c47e1eefc3b1dc7842a15028e30acb678f"} Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.491424 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"5544d7c5-67c2-4f2e-9e0f-d8307d831d5d","Type":"ContainerStarted","Data":"9845057ebb6760fafcd75c4341ecc37bdccba5d24030d2d1842e4279a33e0b98"} Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.493005 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-9w8ql" event={"ID":"0d8cf2d7-e85f-49e8-95e2-c1548f506888","Type":"ContainerStarted","Data":"5d92f2ffb150673f133cef4f2310b9e112e756f8315f875f8721d73204a8a84e"} Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.495244 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3530fc96-f407-470c-a960-c7cfd844c517","Type":"ContainerStarted","Data":"13ae0c8fe6c716e58a5700641427b6dd307ae8c2d0a073d8282d9b380268549a"} Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.496629 4781 generic.go:334] "Generic (PLEG): container finished" podID="269342fb-c8ab-4f10-8b53-e49c89d73afa" containerID="ba349445ac6dbf4e0bc38cdfc7d92a0f902501ad627806cb431e9cc014144933" exitCode=0 Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.496680 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" event={"ID":"269342fb-c8ab-4f10-8b53-e49c89d73afa","Type":"ContainerDied","Data":"ba349445ac6dbf4e0bc38cdfc7d92a0f902501ad627806cb431e9cc014144933"} Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.496702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" event={"ID":"269342fb-c8ab-4f10-8b53-e49c89d73afa","Type":"ContainerStarted","Data":"bd7a4d6836dbdadb83e3d9b60997ec6ff98877aae1177c7068103735c6b2e9e6"} Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.519243 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.800173859 podStartE2EDuration="39.519219001s" podCreationTimestamp="2025-12-08 20:21:46 +0000 UTC" firstStartedPulling="2025-12-08 20:22:05.733148354 +0000 UTC m=+1041.884431731" lastFinishedPulling="2025-12-08 20:22:24.452193496 +0000 UTC m=+1060.603476873" observedRunningTime="2025-12-08 20:22:25.5087482 +0000 UTC m=+1061.660031577" watchObservedRunningTime="2025-12-08 20:22:25.519219001 +0000 UTC m=+1061.670502398" 
Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.522100 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.528163 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.827009282 podStartE2EDuration="41.528147047s" podCreationTimestamp="2025-12-08 20:21:44 +0000 UTC" firstStartedPulling="2025-12-08 20:22:01.748734954 +0000 UTC m=+1037.900018331" lastFinishedPulling="2025-12-08 20:22:24.449872719 +0000 UTC m=+1060.601156096" observedRunningTime="2025-12-08 20:22:25.52791287 +0000 UTC m=+1061.679196247" watchObservedRunningTime="2025-12-08 20:22:25.528147047 +0000 UTC m=+1061.679430424" Dec 08 20:22:25 crc kubenswrapper[4781]: I1208 20:22:25.543866 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9w8ql" podStartSLOduration=7.225999165 podStartE2EDuration="18.543847558s" podCreationTimestamp="2025-12-08 20:22:07 +0000 UTC" firstStartedPulling="2025-12-08 20:22:13.146963785 +0000 UTC m=+1049.298247162" lastFinishedPulling="2025-12-08 20:22:24.464812178 +0000 UTC m=+1060.616095555" observedRunningTime="2025-12-08 20:22:25.542985863 +0000 UTC m=+1061.694269240" watchObservedRunningTime="2025-12-08 20:22:25.543847558 +0000 UTC m=+1061.695130935" Dec 08 20:22:26 crc kubenswrapper[4781]: I1208 20:22:26.137595 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" path="/var/lib/kubelet/pods/c65c4faf-ee79-4c01-98fc-d7ad607e6da4/volumes" Dec 08 20:22:26 crc kubenswrapper[4781]: I1208 20:22:26.429221 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " 
pod="openstack/swift-storage-0" Dec 08 20:22:26 crc kubenswrapper[4781]: E1208 20:22:26.429378 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 20:22:26 crc kubenswrapper[4781]: E1208 20:22:26.429655 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 20:22:26 crc kubenswrapper[4781]: E1208 20:22:26.429715 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift podName:75ab4f11-7508-4813-83bd-05ef029af585 nodeName:}" failed. No retries permitted until 2025-12-08 20:22:30.4296982 +0000 UTC m=+1066.580981577 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift") pod "swift-storage-0" (UID: "75ab4f11-7508-4813-83bd-05ef029af585") : configmap "swift-ring-files" not found Dec 08 20:22:26 crc kubenswrapper[4781]: I1208 20:22:26.507177 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" event={"ID":"269342fb-c8ab-4f10-8b53-e49c89d73afa","Type":"ContainerStarted","Data":"a0db2c0721bf37b3c44650542a81d1fa594ce2e235bc2ed1eed7d31917f5b6a2"} Dec 08 20:22:26 crc kubenswrapper[4781]: I1208 20:22:26.532268 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" podStartSLOduration=5.532250875 podStartE2EDuration="5.532250875s" podCreationTimestamp="2025-12-08 20:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:22:26.525574443 +0000 UTC m=+1062.676857850" watchObservedRunningTime="2025-12-08 20:22:26.532250875 +0000 UTC m=+1062.683534252" Dec 08 20:22:26 crc kubenswrapper[4781]: I1208 20:22:26.708213 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:26 crc kubenswrapper[4781]: I1208 20:22:26.715642 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 08 20:22:26 crc kubenswrapper[4781]: I1208 20:22:26.764464 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 08 20:22:27 crc kubenswrapper[4781]: I1208 20:22:27.512621 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 08 20:22:27 crc kubenswrapper[4781]: I1208 20:22:27.521700 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 08 20:22:27 crc kubenswrapper[4781]: I1208 20:22:27.560189 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 08 20:22:27 crc kubenswrapper[4781]: I1208 20:22:27.566424 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.561770 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.768559 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 08 20:22:28 crc kubenswrapper[4781]: E1208 20:22:28.769685 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" containerName="init" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.769713 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" containerName="init" Dec 08 20:22:28 crc kubenswrapper[4781]: E1208 20:22:28.769737 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" 
containerName="dnsmasq-dns" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.769746 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" containerName="dnsmasq-dns" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.770428 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65c4faf-ee79-4c01-98fc-d7ad607e6da4" containerName="dnsmasq-dns" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.772424 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.773085 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.776700 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.777391 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.779093 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.779842 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-slh55" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.787428 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.885990 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: 
I1208 20:22:28.886054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.886076 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.886102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-config\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.886131 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gmhx\" (UniqueName: \"kubernetes.io/projected/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-kube-api-access-4gmhx\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.886170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-scripts\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.886188 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.988832 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.988906 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-config\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.988977 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gmhx\" (UniqueName: \"kubernetes.io/projected/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-kube-api-access-4gmhx\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.989391 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-scripts\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.989448 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " 
pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.989955 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.990015 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.990057 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-scripts\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.990465 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-config\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.991145 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.994189 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.994704 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:28 crc kubenswrapper[4781]: I1208 20:22:28.995139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.020410 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gmhx\" (UniqueName: \"kubernetes.io/projected/a1bbb8af-58b4-4eff-9e81-5206ecc06b2e-kube-api-access-4gmhx\") pod \"ovn-northd-0\" (UID: \"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e\") " pod="openstack/ovn-northd-0" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.123854 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.423839 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-vhmq2"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.425431 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.446260 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vhmq2"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.498156 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl6qv\" (UniqueName: \"kubernetes.io/projected/d3fd2e65-3874-4123-98d0-38e820feb05d-kube-api-access-wl6qv\") pod \"keystone-db-create-vhmq2\" (UID: \"d3fd2e65-3874-4123-98d0-38e820feb05d\") " pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.498243 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3fd2e65-3874-4123-98d0-38e820feb05d-operator-scripts\") pod \"keystone-db-create-vhmq2\" (UID: \"d3fd2e65-3874-4123-98d0-38e820feb05d\") " pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.528801 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cdbxl" event={"ID":"ec594735-a472-4c13-b98b-453a80fceb1d","Type":"ContainerStarted","Data":"673f87594272412dd3b15b2bf474616751b144ec8cac22b7e72a51db69d7cd49"} Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.548083 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-cdbxl" podStartSLOduration=4.012493407 podStartE2EDuration="7.548065407s" podCreationTimestamp="2025-12-08 20:22:22 +0000 UTC" firstStartedPulling="2025-12-08 20:22:25.077204366 +0000 UTC m=+1061.228487743" lastFinishedPulling="2025-12-08 20:22:28.612776356 +0000 UTC m=+1064.764059743" observedRunningTime="2025-12-08 20:22:29.546208884 +0000 UTC m=+1065.697492261" watchObservedRunningTime="2025-12-08 20:22:29.548065407 +0000 UTC m=+1065.699348784" Dec 08 20:22:29 
crc kubenswrapper[4781]: I1208 20:22:29.597434 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.599310 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl6qv\" (UniqueName: \"kubernetes.io/projected/d3fd2e65-3874-4123-98d0-38e820feb05d-kube-api-access-wl6qv\") pod \"keystone-db-create-vhmq2\" (UID: \"d3fd2e65-3874-4123-98d0-38e820feb05d\") " pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.599446 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3fd2e65-3874-4123-98d0-38e820feb05d-operator-scripts\") pod \"keystone-db-create-vhmq2\" (UID: \"d3fd2e65-3874-4123-98d0-38e820feb05d\") " pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.600620 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3fd2e65-3874-4123-98d0-38e820feb05d-operator-scripts\") pod \"keystone-db-create-vhmq2\" (UID: \"d3fd2e65-3874-4123-98d0-38e820feb05d\") " pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:29 crc kubenswrapper[4781]: W1208 20:22:29.601067 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbb8af_58b4_4eff_9e81_5206ecc06b2e.slice/crio-330ddb58a7284e17a4eb029ab9818b552e7bcd961944a6d1b4be650fb260338a WatchSource:0}: Error finding container 330ddb58a7284e17a4eb029ab9818b552e7bcd961944a6d1b4be650fb260338a: Status 404 returned error can't find the container with id 330ddb58a7284e17a4eb029ab9818b552e7bcd961944a6d1b4be650fb260338a Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.618116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl6qv\" 
(UniqueName: \"kubernetes.io/projected/d3fd2e65-3874-4123-98d0-38e820feb05d-kube-api-access-wl6qv\") pod \"keystone-db-create-vhmq2\" (UID: \"d3fd2e65-3874-4123-98d0-38e820feb05d\") " pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.685742 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5ece-account-create-update-q2l9w"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.686694 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.689524 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.693873 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5ece-account-create-update-q2l9w"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.701649 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f128781-d8dc-4990-a2b4-bbe58950f8c4-operator-scripts\") pod \"keystone-5ece-account-create-update-q2l9w\" (UID: \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\") " pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.701693 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxxtq\" (UniqueName: \"kubernetes.io/projected/2f128781-d8dc-4990-a2b4-bbe58950f8c4-kube-api-access-dxxtq\") pod \"keystone-5ece-account-create-update-q2l9w\" (UID: \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\") " pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.742576 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.785394 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-trdlp"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.786343 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-trdlp" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.803080 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-trdlp"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.803966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-operator-scripts\") pod \"placement-db-create-trdlp\" (UID: \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\") " pod="openstack/placement-db-create-trdlp" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.804190 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f128781-d8dc-4990-a2b4-bbe58950f8c4-operator-scripts\") pod \"keystone-5ece-account-create-update-q2l9w\" (UID: \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\") " pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.804217 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxxtq\" (UniqueName: \"kubernetes.io/projected/2f128781-d8dc-4990-a2b4-bbe58950f8c4-kube-api-access-dxxtq\") pod \"keystone-5ece-account-create-update-q2l9w\" (UID: \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\") " pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.804375 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nt27r\" (UniqueName: \"kubernetes.io/projected/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-kube-api-access-nt27r\") pod \"placement-db-create-trdlp\" (UID: \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\") " pod="openstack/placement-db-create-trdlp" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.805271 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f128781-d8dc-4990-a2b4-bbe58950f8c4-operator-scripts\") pod \"keystone-5ece-account-create-update-q2l9w\" (UID: \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\") " pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.824409 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxxtq\" (UniqueName: \"kubernetes.io/projected/2f128781-d8dc-4990-a2b4-bbe58950f8c4-kube-api-access-dxxtq\") pod \"keystone-5ece-account-create-update-q2l9w\" (UID: \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\") " pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.887676 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e0fb-account-create-update-fwgdh"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.897106 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.899103 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.905166 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e0fb-account-create-update-fwgdh"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.905879 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-operator-scripts\") pod \"placement-e0fb-account-create-update-fwgdh\" (UID: \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\") " pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.905966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6r5l\" (UniqueName: \"kubernetes.io/projected/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-kube-api-access-t6r5l\") pod \"placement-e0fb-account-create-update-fwgdh\" (UID: \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\") " pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.906011 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt27r\" (UniqueName: \"kubernetes.io/projected/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-kube-api-access-nt27r\") pod \"placement-db-create-trdlp\" (UID: \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\") " pod="openstack/placement-db-create-trdlp" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.906045 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-operator-scripts\") pod 
\"placement-db-create-trdlp\" (UID: \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\") " pod="openstack/placement-db-create-trdlp" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.906995 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-operator-scripts\") pod \"placement-db-create-trdlp\" (UID: \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\") " pod="openstack/placement-db-create-trdlp" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.922257 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt27r\" (UniqueName: \"kubernetes.io/projected/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-kube-api-access-nt27r\") pod \"placement-db-create-trdlp\" (UID: \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\") " pod="openstack/placement-db-create-trdlp" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.948268 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.948340 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.948393 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.949201 4781 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25df9dbfafc8a5164a8f6020132a91b9381bc39dadc7e73659feabffa41e871a"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.949267 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://25df9dbfafc8a5164a8f6020132a91b9381bc39dadc7e73659feabffa41e871a" gracePeriod=600 Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.991332 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-zxgkj"] Dec 08 20:22:29 crc kubenswrapper[4781]: I1208 20:22:29.992762 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.006132 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.007449 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8p52\" (UniqueName: \"kubernetes.io/projected/bf42e858-785c-484b-ab72-cddeddbdd145-kube-api-access-p8p52\") pod \"glance-db-create-zxgkj\" (UID: \"bf42e858-785c-484b-ab72-cddeddbdd145\") " pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.007588 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-operator-scripts\") pod \"placement-e0fb-account-create-update-fwgdh\" (UID: \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\") " pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.007632 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf42e858-785c-484b-ab72-cddeddbdd145-operator-scripts\") pod \"glance-db-create-zxgkj\" (UID: \"bf42e858-785c-484b-ab72-cddeddbdd145\") " pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.007659 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6r5l\" (UniqueName: \"kubernetes.io/projected/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-kube-api-access-t6r5l\") pod \"placement-e0fb-account-create-update-fwgdh\" (UID: \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\") " pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.008344 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-operator-scripts\") 
pod \"placement-e0fb-account-create-update-fwgdh\" (UID: \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\") " pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.009967 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zxgkj"] Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.026242 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6r5l\" (UniqueName: \"kubernetes.io/projected/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-kube-api-access-t6r5l\") pod \"placement-e0fb-account-create-update-fwgdh\" (UID: \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\") " pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.098292 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b342-account-create-update-b5hpz"] Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.099345 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.102005 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.104240 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b342-account-create-update-b5hpz"] Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.111410 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90921e9-885e-436c-833a-8f02c075c898-operator-scripts\") pod \"glance-b342-account-create-update-b5hpz\" (UID: \"c90921e9-885e-436c-833a-8f02c075c898\") " pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.111503 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf42e858-785c-484b-ab72-cddeddbdd145-operator-scripts\") pod \"glance-db-create-zxgkj\" (UID: \"bf42e858-785c-484b-ab72-cddeddbdd145\") " pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.111579 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8p52\" (UniqueName: \"kubernetes.io/projected/bf42e858-785c-484b-ab72-cddeddbdd145-kube-api-access-p8p52\") pod \"glance-db-create-zxgkj\" (UID: \"bf42e858-785c-484b-ab72-cddeddbdd145\") " pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.111619 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbgjk\" (UniqueName: \"kubernetes.io/projected/c90921e9-885e-436c-833a-8f02c075c898-kube-api-access-hbgjk\") pod \"glance-b342-account-create-update-b5hpz\" (UID: 
\"c90921e9-885e-436c-833a-8f02c075c898\") " pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.115461 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf42e858-785c-484b-ab72-cddeddbdd145-operator-scripts\") pod \"glance-db-create-zxgkj\" (UID: \"bf42e858-785c-484b-ab72-cddeddbdd145\") " pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.131674 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8p52\" (UniqueName: \"kubernetes.io/projected/bf42e858-785c-484b-ab72-cddeddbdd145-kube-api-access-p8p52\") pod \"glance-db-create-zxgkj\" (UID: \"bf42e858-785c-484b-ab72-cddeddbdd145\") " pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.215133 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbgjk\" (UniqueName: \"kubernetes.io/projected/c90921e9-885e-436c-833a-8f02c075c898-kube-api-access-hbgjk\") pod \"glance-b342-account-create-update-b5hpz\" (UID: \"c90921e9-885e-436c-833a-8f02c075c898\") " pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.215508 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90921e9-885e-436c-833a-8f02c075c898-operator-scripts\") pod \"glance-b342-account-create-update-b5hpz\" (UID: \"c90921e9-885e-436c-833a-8f02c075c898\") " pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.216332 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90921e9-885e-436c-833a-8f02c075c898-operator-scripts\") pod \"glance-b342-account-create-update-b5hpz\" 
(UID: \"c90921e9-885e-436c-833a-8f02c075c898\") " pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.219502 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-trdlp" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.253998 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbgjk\" (UniqueName: \"kubernetes.io/projected/c90921e9-885e-436c-833a-8f02c075c898-kube-api-access-hbgjk\") pod \"glance-b342-account-create-update-b5hpz\" (UID: \"c90921e9-885e-436c-833a-8f02c075c898\") " pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.275999 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-vhmq2"] Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.280723 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.322882 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.421477 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.523179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:30 crc kubenswrapper[4781]: E1208 20:22:30.523740 4781 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 08 20:22:30 crc kubenswrapper[4781]: E1208 20:22:30.523760 4781 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 08 20:22:30 crc kubenswrapper[4781]: E1208 20:22:30.523814 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift podName:75ab4f11-7508-4813-83bd-05ef029af585 nodeName:}" failed. No retries permitted until 2025-12-08 20:22:38.52379486 +0000 UTC m=+1074.675078237 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift") pod "swift-storage-0" (UID: "75ab4f11-7508-4813-83bd-05ef029af585") : configmap "swift-ring-files" not found Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.702243 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5ece-account-create-update-q2l9w"] Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.739486 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhmq2" event={"ID":"d3fd2e65-3874-4123-98d0-38e820feb05d","Type":"ContainerStarted","Data":"4ae95672cfd87cb81dbcf8dbafbbbd5bb7f4221f18ad1f70a0b672638496e427"} Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.746004 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e","Type":"ContainerStarted","Data":"330ddb58a7284e17a4eb029ab9818b552e7bcd961944a6d1b4be650fb260338a"} Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.807399 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="25df9dbfafc8a5164a8f6020132a91b9381bc39dadc7e73659feabffa41e871a" exitCode=0 Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.808423 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"25df9dbfafc8a5164a8f6020132a91b9381bc39dadc7e73659feabffa41e871a"} Dec 08 20:22:30 crc kubenswrapper[4781]: I1208 20:22:30.808449 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"4fea80c9d2853786513b0b8aceae77577c3f9f5cebb3cd832d508e012c04f4da"} Dec 08 20:22:30 crc 
kubenswrapper[4781]: I1208 20:22:30.808481 4781 scope.go:117] "RemoveContainer" containerID="771a2f271b567bb174dbc7c73044708946a8faab42d018d48f6894347bef1ff3" Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.079026 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-trdlp"] Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.289421 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-zxgkj"] Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.301568 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e0fb-account-create-update-fwgdh"] Dec 08 20:22:31 crc kubenswrapper[4781]: W1208 20:22:31.375275 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d9b08ae_f5f5_430b_9c0c_f54aafa4261e.slice/crio-a54a9032cd1eac7358e285528447404c63a333189acb9aacdc025e26345876df WatchSource:0}: Error finding container a54a9032cd1eac7358e285528447404c63a333189acb9aacdc025e26345876df: Status 404 returned error can't find the container with id a54a9032cd1eac7358e285528447404c63a333189acb9aacdc025e26345876df Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.518209 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b342-account-create-update-b5hpz"] Dec 08 20:22:31 crc kubenswrapper[4781]: W1208 20:22:31.573010 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90921e9_885e_436c_833a_8f02c075c898.slice/crio-b2080d228c77d8102828b23663ff8cac408db3e880a18350f1a4b8b69d7b59b8 WatchSource:0}: Error finding container b2080d228c77d8102828b23663ff8cac408db3e880a18350f1a4b8b69d7b59b8: Status 404 returned error can't find the container with id b2080d228c77d8102828b23663ff8cac408db3e880a18350f1a4b8b69d7b59b8 Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.709103 4781 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.770889 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-h7t2l"] Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.771748 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" podUID="739c8ff6-4b97-4e48-8418-a06815318dc5" containerName="dnsmasq-dns" containerID="cri-o://2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc" gracePeriod=10 Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.827835 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e0fb-account-create-update-fwgdh" event={"ID":"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e","Type":"ContainerStarted","Data":"a54a9032cd1eac7358e285528447404c63a333189acb9aacdc025e26345876df"} Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.831936 4781 generic.go:334] "Generic (PLEG): container finished" podID="2f128781-d8dc-4990-a2b4-bbe58950f8c4" containerID="3f4167cbfbffa5757a1ca63f27e83180dc6295488d9338052690c3acf214cc9a" exitCode=0 Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.832016 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5ece-account-create-update-q2l9w" event={"ID":"2f128781-d8dc-4990-a2b4-bbe58950f8c4","Type":"ContainerDied","Data":"3f4167cbfbffa5757a1ca63f27e83180dc6295488d9338052690c3acf214cc9a"} Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.832075 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5ece-account-create-update-q2l9w" event={"ID":"2f128781-d8dc-4990-a2b4-bbe58950f8c4","Type":"ContainerStarted","Data":"e6c3b25e181bb65d708309c855557590d8b85c74a087a0d666f573d4fa1eae66"} Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.834015 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-b342-account-create-update-b5hpz" event={"ID":"c90921e9-885e-436c-833a-8f02c075c898","Type":"ContainerStarted","Data":"b2080d228c77d8102828b23663ff8cac408db3e880a18350f1a4b8b69d7b59b8"} Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.835907 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zxgkj" event={"ID":"bf42e858-785c-484b-ab72-cddeddbdd145","Type":"ContainerStarted","Data":"270a7cf099d2cb2aa5dd6593225b907b655a909eb63764e65a179b7d63419a93"} Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.837265 4781 generic.go:334] "Generic (PLEG): container finished" podID="d3fd2e65-3874-4123-98d0-38e820feb05d" containerID="0f132fa717169ed1cefed093dab0a382dee9183233cacfab5249ea8c16c8f53f" exitCode=0 Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.837307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhmq2" event={"ID":"d3fd2e65-3874-4123-98d0-38e820feb05d","Type":"ContainerDied","Data":"0f132fa717169ed1cefed093dab0a382dee9183233cacfab5249ea8c16c8f53f"} Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.838677 4781 generic.go:334] "Generic (PLEG): container finished" podID="b2c3df1d-d4ef-4f84-96a8-a4b6b2345844" containerID="3a3decd880070a4dddfa0a7e460979a7dd8508a5cb06adfcafcfae3a1a54751f" exitCode=0 Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.838731 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-trdlp" event={"ID":"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844","Type":"ContainerDied","Data":"3a3decd880070a4dddfa0a7e460979a7dd8508a5cb06adfcafcfae3a1a54751f"} Dec 08 20:22:31 crc kubenswrapper[4781]: I1208 20:22:31.838759 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-trdlp" event={"ID":"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844","Type":"ContainerStarted","Data":"89d3de6f0c8ce0cbe2b56f4c01c80891a58cf5a75b7c69e6d7616cb90697cd01"} Dec 08 20:22:32 crc kubenswrapper[4781]: 
I1208 20:22:32.285656 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.467871 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-sb\") pod \"739c8ff6-4b97-4e48-8418-a06815318dc5\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.467981 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kcnl\" (UniqueName: \"kubernetes.io/projected/739c8ff6-4b97-4e48-8418-a06815318dc5-kube-api-access-8kcnl\") pod \"739c8ff6-4b97-4e48-8418-a06815318dc5\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.468103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-config\") pod \"739c8ff6-4b97-4e48-8418-a06815318dc5\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.468165 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-dns-svc\") pod \"739c8ff6-4b97-4e48-8418-a06815318dc5\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.468188 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-nb\") pod \"739c8ff6-4b97-4e48-8418-a06815318dc5\" (UID: \"739c8ff6-4b97-4e48-8418-a06815318dc5\") " Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.472650 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739c8ff6-4b97-4e48-8418-a06815318dc5-kube-api-access-8kcnl" (OuterVolumeSpecName: "kube-api-access-8kcnl") pod "739c8ff6-4b97-4e48-8418-a06815318dc5" (UID: "739c8ff6-4b97-4e48-8418-a06815318dc5"). InnerVolumeSpecName "kube-api-access-8kcnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.521629 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "739c8ff6-4b97-4e48-8418-a06815318dc5" (UID: "739c8ff6-4b97-4e48-8418-a06815318dc5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.521692 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "739c8ff6-4b97-4e48-8418-a06815318dc5" (UID: "739c8ff6-4b97-4e48-8418-a06815318dc5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.529705 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-config" (OuterVolumeSpecName: "config") pod "739c8ff6-4b97-4e48-8418-a06815318dc5" (UID: "739c8ff6-4b97-4e48-8418-a06815318dc5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.530114 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "739c8ff6-4b97-4e48-8418-a06815318dc5" (UID: "739c8ff6-4b97-4e48-8418-a06815318dc5"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.570624 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.570678 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kcnl\" (UniqueName: \"kubernetes.io/projected/739c8ff6-4b97-4e48-8418-a06815318dc5-kube-api-access-8kcnl\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.570691 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.570701 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.570709 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/739c8ff6-4b97-4e48-8418-a06815318dc5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.848841 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e","Type":"ContainerStarted","Data":"721f9bf3408d647620f0f05aca8a2d43ff3da2a48ee1b52db1759e08089f8bcd"} Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.849246 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"a1bbb8af-58b4-4eff-9e81-5206ecc06b2e","Type":"ContainerStarted","Data":"8005a5d7ab16406504b7f6ac3412a360982e3db722ffcdefba3cc445ceb2d35d"} Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.849339 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.851066 4781 generic.go:334] "Generic (PLEG): container finished" podID="739c8ff6-4b97-4e48-8418-a06815318dc5" containerID="2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc" exitCode=0 Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.851118 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" event={"ID":"739c8ff6-4b97-4e48-8418-a06815318dc5","Type":"ContainerDied","Data":"2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc"} Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.851141 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" event={"ID":"739c8ff6-4b97-4e48-8418-a06815318dc5","Type":"ContainerDied","Data":"bcaa7325c1f3624012c89f97d52aa5a25277f0d46e931f3d61877f13a656eb7d"} Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.851159 4781 scope.go:117] "RemoveContainer" containerID="2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.851280 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-h7t2l" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.855745 4781 generic.go:334] "Generic (PLEG): container finished" podID="c90921e9-885e-436c-833a-8f02c075c898" containerID="970549ce7c1ea4b18c2df6edbae4e26e00ca2f5c23efb145ac0e004ec668fe7e" exitCode=0 Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.855793 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b342-account-create-update-b5hpz" event={"ID":"c90921e9-885e-436c-833a-8f02c075c898","Type":"ContainerDied","Data":"970549ce7c1ea4b18c2df6edbae4e26e00ca2f5c23efb145ac0e004ec668fe7e"} Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.857636 4781 generic.go:334] "Generic (PLEG): container finished" podID="bf42e858-785c-484b-ab72-cddeddbdd145" containerID="673b071febd5166e89e6fad72a1172c8e6eec2af6fa3dadd1fa5727d0a3eed5d" exitCode=0 Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.857701 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zxgkj" event={"ID":"bf42e858-785c-484b-ab72-cddeddbdd145","Type":"ContainerDied","Data":"673b071febd5166e89e6fad72a1172c8e6eec2af6fa3dadd1fa5727d0a3eed5d"} Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.859214 4781 generic.go:334] "Generic (PLEG): container finished" podID="8d9b08ae-f5f5-430b-9c0c-f54aafa4261e" containerID="14e2b4d5b8812b13b2b7ca3b30c52c8fe7b05a83ca68dde969ee627990d16d0f" exitCode=0 Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.859405 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e0fb-account-create-update-fwgdh" event={"ID":"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e","Type":"ContainerDied","Data":"14e2b4d5b8812b13b2b7ca3b30c52c8fe7b05a83ca68dde969ee627990d16d0f"} Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.869631 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.831132377 
podStartE2EDuration="4.869619051s" podCreationTimestamp="2025-12-08 20:22:28 +0000 UTC" firstStartedPulling="2025-12-08 20:22:29.603232172 +0000 UTC m=+1065.754515549" lastFinishedPulling="2025-12-08 20:22:31.641718846 +0000 UTC m=+1067.793002223" observedRunningTime="2025-12-08 20:22:32.868996563 +0000 UTC m=+1069.020279940" watchObservedRunningTime="2025-12-08 20:22:32.869619051 +0000 UTC m=+1069.020902428" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.883307 4781 scope.go:117] "RemoveContainer" containerID="08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.926689 4781 scope.go:117] "RemoveContainer" containerID="2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc" Dec 08 20:22:32 crc kubenswrapper[4781]: E1208 20:22:32.927231 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc\": container with ID starting with 2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc not found: ID does not exist" containerID="2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.927263 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc"} err="failed to get container status \"2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc\": rpc error: code = NotFound desc = could not find container \"2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc\": container with ID starting with 2b838618dd8ab1f707ac9a204c76f3665f441eabf6c357f43bd02b3b105896dc not found: ID does not exist" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.927287 4781 scope.go:117] "RemoveContainer" 
containerID="08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68" Dec 08 20:22:32 crc kubenswrapper[4781]: E1208 20:22:32.929974 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68\": container with ID starting with 08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68 not found: ID does not exist" containerID="08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68" Dec 08 20:22:32 crc kubenswrapper[4781]: I1208 20:22:32.930001 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68"} err="failed to get container status \"08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68\": rpc error: code = NotFound desc = could not find container \"08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68\": container with ID starting with 08b79665ee5d67bdfb2450aac35e4469f658d2c61fc3137d40a4ff5689199b68 not found: ID does not exist" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.000293 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-h7t2l"] Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.006826 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-h7t2l"] Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.429114 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.439546 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-trdlp" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.449269 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.595216 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl6qv\" (UniqueName: \"kubernetes.io/projected/d3fd2e65-3874-4123-98d0-38e820feb05d-kube-api-access-wl6qv\") pod \"d3fd2e65-3874-4123-98d0-38e820feb05d\" (UID: \"d3fd2e65-3874-4123-98d0-38e820feb05d\") " Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.595273 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f128781-d8dc-4990-a2b4-bbe58950f8c4-operator-scripts\") pod \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\" (UID: \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\") " Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.595371 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxxtq\" (UniqueName: \"kubernetes.io/projected/2f128781-d8dc-4990-a2b4-bbe58950f8c4-kube-api-access-dxxtq\") pod \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\" (UID: \"2f128781-d8dc-4990-a2b4-bbe58950f8c4\") " Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.595400 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-operator-scripts\") pod \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\" (UID: \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\") " Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.595416 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3fd2e65-3874-4123-98d0-38e820feb05d-operator-scripts\") pod \"d3fd2e65-3874-4123-98d0-38e820feb05d\" (UID: \"d3fd2e65-3874-4123-98d0-38e820feb05d\") " Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.595444 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nt27r\" (UniqueName: \"kubernetes.io/projected/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-kube-api-access-nt27r\") pod \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\" (UID: \"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844\") " Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.602498 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-kube-api-access-nt27r" (OuterVolumeSpecName: "kube-api-access-nt27r") pod "b2c3df1d-d4ef-4f84-96a8-a4b6b2345844" (UID: "b2c3df1d-d4ef-4f84-96a8-a4b6b2345844"). InnerVolumeSpecName "kube-api-access-nt27r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.602844 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2c3df1d-d4ef-4f84-96a8-a4b6b2345844" (UID: "b2c3df1d-d4ef-4f84-96a8-a4b6b2345844"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.603211 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f128781-d8dc-4990-a2b4-bbe58950f8c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2f128781-d8dc-4990-a2b4-bbe58950f8c4" (UID: "2f128781-d8dc-4990-a2b4-bbe58950f8c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.603274 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3fd2e65-3874-4123-98d0-38e820feb05d-kube-api-access-wl6qv" (OuterVolumeSpecName: "kube-api-access-wl6qv") pod "d3fd2e65-3874-4123-98d0-38e820feb05d" (UID: "d3fd2e65-3874-4123-98d0-38e820feb05d"). 
InnerVolumeSpecName "kube-api-access-wl6qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.605224 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3fd2e65-3874-4123-98d0-38e820feb05d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3fd2e65-3874-4123-98d0-38e820feb05d" (UID: "d3fd2e65-3874-4123-98d0-38e820feb05d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.611094 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f128781-d8dc-4990-a2b4-bbe58950f8c4-kube-api-access-dxxtq" (OuterVolumeSpecName: "kube-api-access-dxxtq") pod "2f128781-d8dc-4990-a2b4-bbe58950f8c4" (UID: "2f128781-d8dc-4990-a2b4-bbe58950f8c4"). InnerVolumeSpecName "kube-api-access-dxxtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.697072 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxxtq\" (UniqueName: \"kubernetes.io/projected/2f128781-d8dc-4990-a2b4-bbe58950f8c4-kube-api-access-dxxtq\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.697114 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.697128 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3fd2e65-3874-4123-98d0-38e820feb05d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.697141 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt27r\" (UniqueName: 
\"kubernetes.io/projected/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844-kube-api-access-nt27r\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.697153 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl6qv\" (UniqueName: \"kubernetes.io/projected/d3fd2e65-3874-4123-98d0-38e820feb05d-kube-api-access-wl6qv\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.697167 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2f128781-d8dc-4990-a2b4-bbe58950f8c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.869530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5ece-account-create-update-q2l9w" event={"ID":"2f128781-d8dc-4990-a2b4-bbe58950f8c4","Type":"ContainerDied","Data":"e6c3b25e181bb65d708309c855557590d8b85c74a087a0d666f573d4fa1eae66"} Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.869582 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6c3b25e181bb65d708309c855557590d8b85c74a087a0d666f573d4fa1eae66" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.869643 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5ece-account-create-update-q2l9w" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.875684 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-vhmq2" event={"ID":"d3fd2e65-3874-4123-98d0-38e820feb05d","Type":"ContainerDied","Data":"4ae95672cfd87cb81dbcf8dbafbbbd5bb7f4221f18ad1f70a0b672638496e427"} Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.876011 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ae95672cfd87cb81dbcf8dbafbbbd5bb7f4221f18ad1f70a0b672638496e427" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.876070 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-vhmq2" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.879722 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-trdlp" event={"ID":"b2c3df1d-d4ef-4f84-96a8-a4b6b2345844","Type":"ContainerDied","Data":"89d3de6f0c8ce0cbe2b56f4c01c80891a58cf5a75b7c69e6d7616cb90697cd01"} Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.879748 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-trdlp" Dec 08 20:22:33 crc kubenswrapper[4781]: I1208 20:22:33.879775 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89d3de6f0c8ce0cbe2b56f4c01c80891a58cf5a75b7c69e6d7616cb90697cd01" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.155032 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="739c8ff6-4b97-4e48-8418-a06815318dc5" path="/var/lib/kubelet/pods/739c8ff6-4b97-4e48-8418-a06815318dc5/volumes" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.233166 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.412389 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbgjk\" (UniqueName: \"kubernetes.io/projected/c90921e9-885e-436c-833a-8f02c075c898-kube-api-access-hbgjk\") pod \"c90921e9-885e-436c-833a-8f02c075c898\" (UID: \"c90921e9-885e-436c-833a-8f02c075c898\") " Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.412640 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90921e9-885e-436c-833a-8f02c075c898-operator-scripts\") pod \"c90921e9-885e-436c-833a-8f02c075c898\" (UID: \"c90921e9-885e-436c-833a-8f02c075c898\") " Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.413242 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c90921e9-885e-436c-833a-8f02c075c898-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c90921e9-885e-436c-833a-8f02c075c898" (UID: "c90921e9-885e-436c-833a-8f02c075c898"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.417427 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90921e9-885e-436c-833a-8f02c075c898-kube-api-access-hbgjk" (OuterVolumeSpecName: "kube-api-access-hbgjk") pod "c90921e9-885e-436c-833a-8f02c075c898" (UID: "c90921e9-885e-436c-833a-8f02c075c898"). InnerVolumeSpecName "kube-api-access-hbgjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.442349 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.447321 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.516283 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8p52\" (UniqueName: \"kubernetes.io/projected/bf42e858-785c-484b-ab72-cddeddbdd145-kube-api-access-p8p52\") pod \"bf42e858-785c-484b-ab72-cddeddbdd145\" (UID: \"bf42e858-785c-484b-ab72-cddeddbdd145\") " Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.516384 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6r5l\" (UniqueName: \"kubernetes.io/projected/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-kube-api-access-t6r5l\") pod \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\" (UID: \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\") " Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.516458 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf42e858-785c-484b-ab72-cddeddbdd145-operator-scripts\") pod \"bf42e858-785c-484b-ab72-cddeddbdd145\" (UID: \"bf42e858-785c-484b-ab72-cddeddbdd145\") " Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.516876 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbgjk\" (UniqueName: \"kubernetes.io/projected/c90921e9-885e-436c-833a-8f02c075c898-kube-api-access-hbgjk\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.516893 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c90921e9-885e-436c-833a-8f02c075c898-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.517393 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf42e858-785c-484b-ab72-cddeddbdd145-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf42e858-785c-484b-ab72-cddeddbdd145" (UID: "bf42e858-785c-484b-ab72-cddeddbdd145"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.519652 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf42e858-785c-484b-ab72-cddeddbdd145-kube-api-access-p8p52" (OuterVolumeSpecName: "kube-api-access-p8p52") pod "bf42e858-785c-484b-ab72-cddeddbdd145" (UID: "bf42e858-785c-484b-ab72-cddeddbdd145"). InnerVolumeSpecName "kube-api-access-p8p52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.520978 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-kube-api-access-t6r5l" (OuterVolumeSpecName: "kube-api-access-t6r5l") pod "8d9b08ae-f5f5-430b-9c0c-f54aafa4261e" (UID: "8d9b08ae-f5f5-430b-9c0c-f54aafa4261e"). InnerVolumeSpecName "kube-api-access-t6r5l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.617733 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-operator-scripts\") pod \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\" (UID: \"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e\") " Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.618311 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d9b08ae-f5f5-430b-9c0c-f54aafa4261e" (UID: "8d9b08ae-f5f5-430b-9c0c-f54aafa4261e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.618748 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8p52\" (UniqueName: \"kubernetes.io/projected/bf42e858-785c-484b-ab72-cddeddbdd145-kube-api-access-p8p52\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.618806 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6r5l\" (UniqueName: \"kubernetes.io/projected/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-kube-api-access-t6r5l\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.618839 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf42e858-785c-484b-ab72-cddeddbdd145-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.618864 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:34 crc 
kubenswrapper[4781]: I1208 20:22:34.894844 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b342-account-create-update-b5hpz" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.895119 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b342-account-create-update-b5hpz" event={"ID":"c90921e9-885e-436c-833a-8f02c075c898","Type":"ContainerDied","Data":"b2080d228c77d8102828b23663ff8cac408db3e880a18350f1a4b8b69d7b59b8"} Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.895296 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2080d228c77d8102828b23663ff8cac408db3e880a18350f1a4b8b69d7b59b8" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.896403 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-zxgkj" event={"ID":"bf42e858-785c-484b-ab72-cddeddbdd145","Type":"ContainerDied","Data":"270a7cf099d2cb2aa5dd6593225b907b655a909eb63764e65a179b7d63419a93"} Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.896431 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="270a7cf099d2cb2aa5dd6593225b907b655a909eb63764e65a179b7d63419a93" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.896416 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-zxgkj" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.899175 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e0fb-account-create-update-fwgdh" event={"ID":"8d9b08ae-f5f5-430b-9c0c-f54aafa4261e","Type":"ContainerDied","Data":"a54a9032cd1eac7358e285528447404c63a333189acb9aacdc025e26345876df"} Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.899213 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a54a9032cd1eac7358e285528447404c63a333189acb9aacdc025e26345876df" Dec 08 20:22:34 crc kubenswrapper[4781]: I1208 20:22:34.899221 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e0fb-account-create-update-fwgdh" Dec 08 20:22:36 crc kubenswrapper[4781]: I1208 20:22:36.921487 4781 generic.go:334] "Generic (PLEG): container finished" podID="ec594735-a472-4c13-b98b-453a80fceb1d" containerID="673f87594272412dd3b15b2bf474616751b144ec8cac22b7e72a51db69d7cd49" exitCode=0 Dec 08 20:22:36 crc kubenswrapper[4781]: I1208 20:22:36.921823 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cdbxl" event={"ID":"ec594735-a472-4c13-b98b-453a80fceb1d","Type":"ContainerDied","Data":"673f87594272412dd3b15b2bf474616751b144ec8cac22b7e72a51db69d7cd49"} Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.332613 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.475544 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-scripts\") pod \"ec594735-a472-4c13-b98b-453a80fceb1d\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.475593 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b7hs\" (UniqueName: \"kubernetes.io/projected/ec594735-a472-4c13-b98b-453a80fceb1d-kube-api-access-8b7hs\") pod \"ec594735-a472-4c13-b98b-453a80fceb1d\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.475635 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec594735-a472-4c13-b98b-453a80fceb1d-etc-swift\") pod \"ec594735-a472-4c13-b98b-453a80fceb1d\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.475662 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-combined-ca-bundle\") pod \"ec594735-a472-4c13-b98b-453a80fceb1d\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.475738 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-swiftconf\") pod \"ec594735-a472-4c13-b98b-453a80fceb1d\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.475771 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-dispersionconf\") pod \"ec594735-a472-4c13-b98b-453a80fceb1d\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.475827 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-ring-data-devices\") pod \"ec594735-a472-4c13-b98b-453a80fceb1d\" (UID: \"ec594735-a472-4c13-b98b-453a80fceb1d\") " Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.476768 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ec594735-a472-4c13-b98b-453a80fceb1d" (UID: "ec594735-a472-4c13-b98b-453a80fceb1d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.477316 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec594735-a472-4c13-b98b-453a80fceb1d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ec594735-a472-4c13-b98b-453a80fceb1d" (UID: "ec594735-a472-4c13-b98b-453a80fceb1d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.481680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec594735-a472-4c13-b98b-453a80fceb1d-kube-api-access-8b7hs" (OuterVolumeSpecName: "kube-api-access-8b7hs") pod "ec594735-a472-4c13-b98b-453a80fceb1d" (UID: "ec594735-a472-4c13-b98b-453a80fceb1d"). InnerVolumeSpecName "kube-api-access-8b7hs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.484826 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ec594735-a472-4c13-b98b-453a80fceb1d" (UID: "ec594735-a472-4c13-b98b-453a80fceb1d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.498818 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-scripts" (OuterVolumeSpecName: "scripts") pod "ec594735-a472-4c13-b98b-453a80fceb1d" (UID: "ec594735-a472-4c13-b98b-453a80fceb1d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.500070 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ec594735-a472-4c13-b98b-453a80fceb1d" (UID: "ec594735-a472-4c13-b98b-453a80fceb1d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.502280 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec594735-a472-4c13-b98b-453a80fceb1d" (UID: "ec594735-a472-4c13-b98b-453a80fceb1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.578672 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.578860 4781 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.578885 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ec594735-a472-4c13-b98b-453a80fceb1d-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.578901 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b7hs\" (UniqueName: \"kubernetes.io/projected/ec594735-a472-4c13-b98b-453a80fceb1d-kube-api-access-8b7hs\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.578939 4781 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ec594735-a472-4c13-b98b-453a80fceb1d-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.578954 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.578968 4781 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-swiftconf\") on node \"crc\" DevicePath 
\"\"" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.578982 4781 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ec594735-a472-4c13-b98b-453a80fceb1d-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.583392 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/75ab4f11-7508-4813-83bd-05ef029af585-etc-swift\") pod \"swift-storage-0\" (UID: \"75ab4f11-7508-4813-83bd-05ef029af585\") " pod="openstack/swift-storage-0" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.793580 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.941058 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-cdbxl" event={"ID":"ec594735-a472-4c13-b98b-453a80fceb1d","Type":"ContainerDied","Data":"a96aaf6be3a269be48e2693de169a6c47e1eefc3b1dc7842a15028e30acb678f"} Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.941101 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a96aaf6be3a269be48e2693de169a6c47e1eefc3b1dc7842a15028e30acb678f" Dec 08 20:22:38 crc kubenswrapper[4781]: I1208 20:22:38.941178 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-cdbxl" Dec 08 20:22:39 crc kubenswrapper[4781]: E1208 20:22:39.037381 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec594735_a472_4c13_b98b_453a80fceb1d.slice/crio-a96aaf6be3a269be48e2693de169a6c47e1eefc3b1dc7842a15028e30acb678f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec594735_a472_4c13_b98b_453a80fceb1d.slice\": RecentStats: unable to find data in memory cache]" Dec 08 20:22:39 crc kubenswrapper[4781]: I1208 20:22:39.426151 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 08 20:22:39 crc kubenswrapper[4781]: I1208 20:22:39.950428 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"d85292b667ed87433cfd62cbfd88768b141b87fcbc81a0b3d6179483d8e71d0e"} Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.289245 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-b9gc6"] Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.289986 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3fd2e65-3874-4123-98d0-38e820feb05d" containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290004 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3fd2e65-3874-4123-98d0-38e820feb05d" containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.290017 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec594735-a472-4c13-b98b-453a80fceb1d" containerName="swift-ring-rebalance" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290023 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ec594735-a472-4c13-b98b-453a80fceb1d" containerName="swift-ring-rebalance" Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.290033 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c90921e9-885e-436c-833a-8f02c075c898" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290040 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90921e9-885e-436c-833a-8f02c075c898" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.290056 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739c8ff6-4b97-4e48-8418-a06815318dc5" containerName="init" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290063 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="739c8ff6-4b97-4e48-8418-a06815318dc5" containerName="init" Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.290074 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739c8ff6-4b97-4e48-8418-a06815318dc5" containerName="dnsmasq-dns" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290080 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="739c8ff6-4b97-4e48-8418-a06815318dc5" containerName="dnsmasq-dns" Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.290092 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c3df1d-d4ef-4f84-96a8-a4b6b2345844" containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290099 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c3df1d-d4ef-4f84-96a8-a4b6b2345844" containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.290111 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf42e858-785c-484b-ab72-cddeddbdd145" containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290118 4781 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="bf42e858-785c-484b-ab72-cddeddbdd145" containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.290149 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f128781-d8dc-4990-a2b4-bbe58950f8c4" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290157 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f128781-d8dc-4990-a2b4-bbe58950f8c4" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: E1208 20:22:40.290167 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9b08ae-f5f5-430b-9c0c-f54aafa4261e" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290175 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9b08ae-f5f5-430b-9c0c-f54aafa4261e" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290330 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c90921e9-885e-436c-833a-8f02c075c898" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290352 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c3df1d-d4ef-4f84-96a8-a4b6b2345844" containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290365 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="739c8ff6-4b97-4e48-8418-a06815318dc5" containerName="dnsmasq-dns" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290374 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9b08ae-f5f5-430b-9c0c-f54aafa4261e" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290381 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3fd2e65-3874-4123-98d0-38e820feb05d" 
containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290391 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec594735-a472-4c13-b98b-453a80fceb1d" containerName="swift-ring-rebalance" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290402 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f128781-d8dc-4990-a2b4-bbe58950f8c4" containerName="mariadb-account-create-update" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290417 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf42e858-785c-484b-ab72-cddeddbdd145" containerName="mariadb-database-create" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.290904 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.299317 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b9gc6"] Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.306389 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.306617 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4d2w8" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.411408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-config-data\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.411482 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-db-sync-config-data\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.411591 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-combined-ca-bundle\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.411683 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcns\" (UniqueName: \"kubernetes.io/projected/fa6f1315-4ed1-4ae4-988a-81375adf148b-kube-api-access-8lcns\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.513420 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-combined-ca-bundle\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.513505 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcns\" (UniqueName: \"kubernetes.io/projected/fa6f1315-4ed1-4ae4-988a-81375adf148b-kube-api-access-8lcns\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.513567 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-config-data\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.513630 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-db-sync-config-data\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.518654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-db-sync-config-data\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.523026 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-combined-ca-bundle\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.524666 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-config-data\") pod \"glance-db-sync-b9gc6\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.535414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcns\" (UniqueName: \"kubernetes.io/projected/fa6f1315-4ed1-4ae4-988a-81375adf148b-kube-api-access-8lcns\") pod \"glance-db-sync-b9gc6\" (UID: 
\"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:40 crc kubenswrapper[4781]: I1208 20:22:40.626397 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b9gc6" Dec 08 20:22:41 crc kubenswrapper[4781]: I1208 20:22:41.113625 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-b9gc6"] Dec 08 20:22:41 crc kubenswrapper[4781]: I1208 20:22:41.969626 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"d1851a9c9c561d01718a5a40210d237115acef047993bb2f9bf59efb8c87cee2"} Dec 08 20:22:41 crc kubenswrapper[4781]: I1208 20:22:41.969748 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"83eb91cc5378d63a94b6dbe415b6b22b949dda41613f51d8121711dc56970f98"} Dec 08 20:22:41 crc kubenswrapper[4781]: I1208 20:22:41.972404 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b9gc6" event={"ID":"fa6f1315-4ed1-4ae4-988a-81375adf148b","Type":"ContainerStarted","Data":"b997a9cac196cb6e18cf6b915eda87a971c935a7b6f65be53952a19a0970084a"} Dec 08 20:22:42 crc kubenswrapper[4781]: I1208 20:22:42.982492 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"d50fe204b338f4f9fee030fc24a27508088f59bb486ca58d6321d55bcc175dc0"} Dec 08 20:22:42 crc kubenswrapper[4781]: I1208 20:22:42.982546 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"f75a00dbbf515e2d864a2634d7d11fd2f732d8f71def5997aa65af6bbbf2b65e"} Dec 08 20:22:44 crc kubenswrapper[4781]: I1208 20:22:44.225694 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 08 20:22:44 crc kubenswrapper[4781]: I1208 20:22:44.892816 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kf4ss" podUID="a7f5afd4-05f3-4954-9dc9-3efa47c22b85" containerName="ovn-controller" probeResult="failure" output=< Dec 08 20:22:44 crc kubenswrapper[4781]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 08 20:22:44 crc kubenswrapper[4781]: > Dec 08 20:22:45 crc kubenswrapper[4781]: I1208 20:22:45.006389 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"48f1e3d7e0c98e6b6ba6d73403f429151e887678d5f9eac44dd5b630bf9cec69"} Dec 08 20:22:46 crc kubenswrapper[4781]: I1208 20:22:46.027300 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"63c67044c19ea3285418afb32c530f7c75dfbc666a17012b202fc9aab291d8aa"} Dec 08 20:22:46 crc kubenswrapper[4781]: I1208 20:22:46.027654 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"a820f1fe1ca338fc1cd46dd856e42695f212f5bf4905bf47d2cb80d090641d60"} Dec 08 20:22:47 crc kubenswrapper[4781]: I1208 20:22:47.050197 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"2a3da742e2c9351794b1b5619bed8eae67c6da7cd78bb3b78ada2aaac416e4fa"} Dec 08 20:22:49 crc kubenswrapper[4781]: I1208 20:22:49.981729 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kf4ss" podUID="a7f5afd4-05f3-4954-9dc9-3efa47c22b85" containerName="ovn-controller" 
probeResult="failure" output=< Dec 08 20:22:49 crc kubenswrapper[4781]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 08 20:22:49 crc kubenswrapper[4781]: > Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.049256 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.050512 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9cv7c" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.376756 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kf4ss-config-85rrq"] Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.378031 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.382237 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.389784 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kf4ss-config-85rrq"] Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.744639 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.744826 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-log-ovn\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: 
\"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.744933 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ztxz\" (UniqueName: \"kubernetes.io/projected/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-kube-api-access-4ztxz\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.745038 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run-ovn\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.745094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-additional-scripts\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.745135 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-scripts\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.846453 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run-ovn\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.846520 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-additional-scripts\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.846552 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-scripts\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.846576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.846642 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-log-ovn\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.846682 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ztxz\" (UniqueName: 
\"kubernetes.io/projected/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-kube-api-access-4ztxz\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.846733 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run-ovn\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.846818 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.847051 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-log-ovn\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.847890 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-additional-scripts\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.849201 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-scripts\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:50 crc kubenswrapper[4781]: I1208 20:22:50.914426 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ztxz\" (UniqueName: \"kubernetes.io/projected/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-kube-api-access-4ztxz\") pod \"ovn-controller-kf4ss-config-85rrq\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:51 crc kubenswrapper[4781]: I1208 20:22:51.008399 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:22:52 crc kubenswrapper[4781]: I1208 20:22:52.090157 4781 generic.go:334] "Generic (PLEG): container finished" podID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerID="3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9" exitCode=0 Dec 08 20:22:52 crc kubenswrapper[4781]: I1208 20:22:52.090205 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"202f9454-de0b-4a09-abb6-dacbea9b5fa4","Type":"ContainerDied","Data":"3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9"} Dec 08 20:22:52 crc kubenswrapper[4781]: I1208 20:22:52.093766 4781 generic.go:334] "Generic (PLEG): container finished" podID="9192ae66-92ec-4618-aecd-3ec306da8525" containerID="53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558" exitCode=0 Dec 08 20:22:52 crc kubenswrapper[4781]: I1208 20:22:52.093824 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9192ae66-92ec-4618-aecd-3ec306da8525","Type":"ContainerDied","Data":"53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558"} Dec 08 20:22:54 crc kubenswrapper[4781]: I1208 20:22:54.934865 4781 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kf4ss" podUID="a7f5afd4-05f3-4954-9dc9-3efa47c22b85" containerName="ovn-controller" probeResult="failure" output=< Dec 08 20:22:54 crc kubenswrapper[4781]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 08 20:22:54 crc kubenswrapper[4781]: > Dec 08 20:22:57 crc kubenswrapper[4781]: E1208 20:22:57.087191 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63" Dec 08 20:22:57 crc kubenswrapper[4781]: E1208 20:22:57.087650 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8lcns,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-b9gc6_openstack(fa6f1315-4ed1-4ae4-988a-81375adf148b): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 08 20:22:57 crc kubenswrapper[4781]: E1208 20:22:57.088843 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-b9gc6" podUID="fa6f1315-4ed1-4ae4-988a-81375adf148b" Dec 08 20:22:57 crc kubenswrapper[4781]: E1208 20:22:57.163062 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63\\\"\"" pod="openstack/glance-db-sync-b9gc6" podUID="fa6f1315-4ed1-4ae4-988a-81375adf148b" Dec 08 20:22:57 crc kubenswrapper[4781]: I1208 20:22:57.796079 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kf4ss-config-85rrq"] Dec 08 20:22:57 crc kubenswrapper[4781]: W1208 20:22:57.796515 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ba82277_9ed7_4c8c_ab3d_7b34d7cdffa3.slice/crio-045a013d09244b0825819a26437f1280d03cc3fb579e7605b6748436d16b335e WatchSource:0}: Error finding container 045a013d09244b0825819a26437f1280d03cc3fb579e7605b6748436d16b335e: Status 404 returned error can't find the container with id 045a013d09244b0825819a26437f1280d03cc3fb579e7605b6748436d16b335e Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.252442 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kf4ss-config-85rrq" event={"ID":"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3","Type":"ContainerStarted","Data":"045a013d09244b0825819a26437f1280d03cc3fb579e7605b6748436d16b335e"} Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.332897 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"dab3cb9945de67cec54cc3081794d6102ab934add90a01169d5c99db168ec065"} Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.332970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"e3a4587374695e521ea1910f6abf45edc02ff9d8daf6b3bf4be1999e0993d5e7"} Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.332987 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"5c1efea7388920c70dda2fd02214079ac566440a37ba26594f14cfa1b94a064d"} Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.339076 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9192ae66-92ec-4618-aecd-3ec306da8525","Type":"ContainerStarted","Data":"13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7"} Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.340245 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.343114 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"202f9454-de0b-4a09-abb6-dacbea9b5fa4","Type":"ContainerStarted","Data":"0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19"} Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.343691 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.452927 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.730410953 podStartE2EDuration="1m24.452848099s" 
podCreationTimestamp="2025-12-08 20:21:34 +0000 UTC" firstStartedPulling="2025-12-08 20:21:36.955338835 +0000 UTC m=+1013.106622212" lastFinishedPulling="2025-12-08 20:22:12.677775981 +0000 UTC m=+1048.829059358" observedRunningTime="2025-12-08 20:22:58.448227166 +0000 UTC m=+1094.599510543" watchObservedRunningTime="2025-12-08 20:22:58.452848099 +0000 UTC m=+1094.604131476" Dec 08 20:22:58 crc kubenswrapper[4781]: I1208 20:22:58.589629 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.806046994 podStartE2EDuration="1m24.589614277s" podCreationTimestamp="2025-12-08 20:21:34 +0000 UTC" firstStartedPulling="2025-12-08 20:21:37.554579784 +0000 UTC m=+1013.705863161" lastFinishedPulling="2025-12-08 20:22:12.338147067 +0000 UTC m=+1048.489430444" observedRunningTime="2025-12-08 20:22:58.588834474 +0000 UTC m=+1094.740117851" watchObservedRunningTime="2025-12-08 20:22:58.589614277 +0000 UTC m=+1094.740897644" Dec 08 20:22:59 crc kubenswrapper[4781]: I1208 20:22:59.354254 4781 generic.go:334] "Generic (PLEG): container finished" podID="8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" containerID="54ea7d959f9d995a74b1c8d9cb6ee41eabf4c69c6381bc42921a17521ffb6825" exitCode=0 Dec 08 20:22:59 crc kubenswrapper[4781]: I1208 20:22:59.354343 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kf4ss-config-85rrq" event={"ID":"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3","Type":"ContainerDied","Data":"54ea7d959f9d995a74b1c8d9cb6ee41eabf4c69c6381bc42921a17521ffb6825"} Dec 08 20:22:59 crc kubenswrapper[4781]: I1208 20:22:59.363655 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"c56c975e6229a45e7069d0bb70b2bf2d0b62d18e42501c92edc6642b4f386054"} Dec 08 20:22:59 crc kubenswrapper[4781]: I1208 20:22:59.363694 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"472e4f4e271a25038a375ea8f8f38ccdf28cd66e82ec2f6ae2d110e79e50679b"} Dec 08 20:22:59 crc kubenswrapper[4781]: I1208 20:22:59.363705 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"671e3e7f65f3a0538cf2dec45548cb8f0f69c584799697bade6b1ad68f1e19b2"} Dec 08 20:22:59 crc kubenswrapper[4781]: I1208 20:22:59.867627 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-kf4ss" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.378135 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"75ab4f11-7508-4813-83bd-05ef029af585","Type":"ContainerStarted","Data":"b5f71bfb6dd35b066e8c2c4c3a924fbc54b2fd3a994b2cb195a270e1e438e077"} Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.421216 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.471460801 podStartE2EDuration="39.421183818s" podCreationTimestamp="2025-12-08 20:22:21 +0000 UTC" firstStartedPulling="2025-12-08 20:22:39.433902174 +0000 UTC m=+1075.585185551" lastFinishedPulling="2025-12-08 20:22:57.383625191 +0000 UTC m=+1093.534908568" observedRunningTime="2025-12-08 20:23:00.415833624 +0000 UTC m=+1096.567116991" watchObservedRunningTime="2025-12-08 20:23:00.421183818 +0000 UTC m=+1096.572467195" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.656257 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.717145 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" (UID: "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.717219 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run-ovn\") pod \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.717424 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run\") pod \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.717463 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-scripts\") pod \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.717633 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run" (OuterVolumeSpecName: "var-run") pod "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" (UID: "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.717651 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-log-ovn\") pod \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.717738 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" (UID: "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.717990 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-additional-scripts\") pod \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.718103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ztxz\" (UniqueName: \"kubernetes.io/projected/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-kube-api-access-4ztxz\") pod \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\" (UID: \"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3\") " Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.718722 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-scripts" (OuterVolumeSpecName: "scripts") pod "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" (UID: "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.720139 4781 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.720163 4781 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-run\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.720197 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.720210 4781 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.724077 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-kube-api-access-4ztxz" (OuterVolumeSpecName: "kube-api-access-4ztxz") pod "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" (UID: "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3"). InnerVolumeSpecName "kube-api-access-4ztxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.728474 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" (UID: "8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.729531 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-thdb8"] Dec 08 20:23:00 crc kubenswrapper[4781]: E1208 20:23:00.729951 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" containerName="ovn-config" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.729967 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" containerName="ovn-config" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.730117 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" containerName="ovn-config" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.730974 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.737019 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.749738 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-thdb8"] Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.822134 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlmn9\" (UniqueName: \"kubernetes.io/projected/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-kube-api-access-vlmn9\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.822383 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-config\") 
pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.822439 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.822579 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-svc\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.822631 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.822670 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.822785 4781 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.822806 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ztxz\" (UniqueName: \"kubernetes.io/projected/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3-kube-api-access-4ztxz\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.923793 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-svc\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.923854 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.923887 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.923998 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlmn9\" (UniqueName: \"kubernetes.io/projected/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-kube-api-access-vlmn9\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: 
I1208 20:23:00.924074 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-config\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.924102 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.925193 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.925792 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-svc\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.926471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.926912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-config\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.927192 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:00 crc kubenswrapper[4781]: I1208 20:23:00.941635 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlmn9\" (UniqueName: \"kubernetes.io/projected/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-kube-api-access-vlmn9\") pod \"dnsmasq-dns-864b648dc7-thdb8\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.088951 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.387909 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kf4ss-config-85rrq" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.389135 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kf4ss-config-85rrq" event={"ID":"8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3","Type":"ContainerDied","Data":"045a013d09244b0825819a26437f1280d03cc3fb579e7605b6748436d16b335e"} Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.389238 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="045a013d09244b0825819a26437f1280d03cc3fb579e7605b6748436d16b335e" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.572015 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-thdb8"] Dec 08 20:23:01 crc kubenswrapper[4781]: W1208 20:23:01.573097 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65e8d6e4_c7b8_4aca_8df5_ec3f9a655cb0.slice/crio-111c34aed60f1c095052d202265e29ee0eaa2b18dc2eb31ddbf33b7e584e80b7 WatchSource:0}: Error finding container 111c34aed60f1c095052d202265e29ee0eaa2b18dc2eb31ddbf33b7e584e80b7: Status 404 returned error can't find the container with id 111c34aed60f1c095052d202265e29ee0eaa2b18dc2eb31ddbf33b7e584e80b7 Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.780613 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kf4ss-config-85rrq"] Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.787970 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kf4ss-config-85rrq"] Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.872262 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kf4ss-config-g7j4w"] Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.873329 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.875257 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.886432 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kf4ss-config-g7j4w"] Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.952237 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-scripts\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.952317 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-log-ovn\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.952340 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run-ovn\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.952372 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") 
" pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.952396 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-additional-scripts\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:01 crc kubenswrapper[4781]: I1208 20:23:01.952426 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jrqz\" (UniqueName: \"kubernetes.io/projected/8b9f5686-c996-46f5-962a-09086bd0d771-kube-api-access-7jrqz\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054215 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-scripts\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-log-ovn\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054398 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run-ovn\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: 
\"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054446 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054480 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-additional-scripts\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054522 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jrqz\" (UniqueName: \"kubernetes.io/projected/8b9f5686-c996-46f5-962a-09086bd0d771-kube-api-access-7jrqz\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-log-ovn\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: 
\"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.054736 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run-ovn\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.055589 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-additional-scripts\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.056540 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-scripts\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.073848 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jrqz\" (UniqueName: \"kubernetes.io/projected/8b9f5686-c996-46f5-962a-09086bd0d771-kube-api-access-7jrqz\") pod \"ovn-controller-kf4ss-config-g7j4w\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.137310 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3" path="/var/lib/kubelet/pods/8ba82277-9ed7-4c8c-ab3d-7b34d7cdffa3/volumes" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.252942 4781 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.397685 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" event={"ID":"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0","Type":"ContainerStarted","Data":"f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3"} Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.398012 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" event={"ID":"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0","Type":"ContainerStarted","Data":"111c34aed60f1c095052d202265e29ee0eaa2b18dc2eb31ddbf33b7e584e80b7"} Dec 08 20:23:02 crc kubenswrapper[4781]: W1208 20:23:02.693397 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b9f5686_c996_46f5_962a_09086bd0d771.slice/crio-67885cb9cc4b27677657cec7c7a58f5c171d4a135038fb3f5a3fd38b4bd9b77c WatchSource:0}: Error finding container 67885cb9cc4b27677657cec7c7a58f5c171d4a135038fb3f5a3fd38b4bd9b77c: Status 404 returned error can't find the container with id 67885cb9cc4b27677657cec7c7a58f5c171d4a135038fb3f5a3fd38b4bd9b77c Dec 08 20:23:02 crc kubenswrapper[4781]: I1208 20:23:02.699905 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kf4ss-config-g7j4w"] Dec 08 20:23:03 crc kubenswrapper[4781]: I1208 20:23:03.407167 4781 generic.go:334] "Generic (PLEG): container finished" podID="8b9f5686-c996-46f5-962a-09086bd0d771" containerID="ba45309c5c5390af0106f9b0f6201b3213380d92620e4c59a935e3306863d0bf" exitCode=0 Dec 08 20:23:03 crc kubenswrapper[4781]: I1208 20:23:03.407229 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kf4ss-config-g7j4w" 
event={"ID":"8b9f5686-c996-46f5-962a-09086bd0d771","Type":"ContainerDied","Data":"ba45309c5c5390af0106f9b0f6201b3213380d92620e4c59a935e3306863d0bf"} Dec 08 20:23:03 crc kubenswrapper[4781]: I1208 20:23:03.407604 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kf4ss-config-g7j4w" event={"ID":"8b9f5686-c996-46f5-962a-09086bd0d771","Type":"ContainerStarted","Data":"67885cb9cc4b27677657cec7c7a58f5c171d4a135038fb3f5a3fd38b4bd9b77c"} Dec 08 20:23:03 crc kubenswrapper[4781]: I1208 20:23:03.411895 4781 generic.go:334] "Generic (PLEG): container finished" podID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" containerID="f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3" exitCode=0 Dec 08 20:23:03 crc kubenswrapper[4781]: I1208 20:23:03.411969 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" event={"ID":"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0","Type":"ContainerDied","Data":"f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3"} Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.421778 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" event={"ID":"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0","Type":"ContainerStarted","Data":"abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870"} Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.422094 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.445216 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" podStartSLOduration=4.445197276 podStartE2EDuration="4.445197276s" podCreationTimestamp="2025-12-08 20:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:04.442585491 +0000 UTC 
m=+1100.593868898" watchObservedRunningTime="2025-12-08 20:23:04.445197276 +0000 UTC m=+1100.596480663" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.761036 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.897952 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-log-ovn\") pod \"8b9f5686-c996-46f5-962a-09086bd0d771\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898026 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-additional-scripts\") pod \"8b9f5686-c996-46f5-962a-09086bd0d771\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898054 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jrqz\" (UniqueName: \"kubernetes.io/projected/8b9f5686-c996-46f5-962a-09086bd0d771-kube-api-access-7jrqz\") pod \"8b9f5686-c996-46f5-962a-09086bd0d771\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898082 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run-ovn\") pod \"8b9f5686-c996-46f5-962a-09086bd0d771\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898156 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-scripts\") pod 
\"8b9f5686-c996-46f5-962a-09086bd0d771\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898172 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run\") pod \"8b9f5686-c996-46f5-962a-09086bd0d771\" (UID: \"8b9f5686-c996-46f5-962a-09086bd0d771\") " Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898236 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8b9f5686-c996-46f5-962a-09086bd0d771" (UID: "8b9f5686-c996-46f5-962a-09086bd0d771"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898304 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8b9f5686-c996-46f5-962a-09086bd0d771" (UID: "8b9f5686-c996-46f5-962a-09086bd0d771"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898305 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run" (OuterVolumeSpecName: "var-run") pod "8b9f5686-c996-46f5-962a-09086bd0d771" (UID: "8b9f5686-c996-46f5-962a-09086bd0d771"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898545 4781 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898566 4781 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-run\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898577 4781 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8b9f5686-c996-46f5-962a-09086bd0d771-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.898831 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8b9f5686-c996-46f5-962a-09086bd0d771" (UID: "8b9f5686-c996-46f5-962a-09086bd0d771"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.899106 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-scripts" (OuterVolumeSpecName: "scripts") pod "8b9f5686-c996-46f5-962a-09086bd0d771" (UID: "8b9f5686-c996-46f5-962a-09086bd0d771"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:04 crc kubenswrapper[4781]: I1208 20:23:04.904039 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9f5686-c996-46f5-962a-09086bd0d771-kube-api-access-7jrqz" (OuterVolumeSpecName: "kube-api-access-7jrqz") pod "8b9f5686-c996-46f5-962a-09086bd0d771" (UID: "8b9f5686-c996-46f5-962a-09086bd0d771"). InnerVolumeSpecName "kube-api-access-7jrqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:05 crc kubenswrapper[4781]: I1208 20:23:05.000269 4781 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:05 crc kubenswrapper[4781]: I1208 20:23:05.000317 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jrqz\" (UniqueName: \"kubernetes.io/projected/8b9f5686-c996-46f5-962a-09086bd0d771-kube-api-access-7jrqz\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:05 crc kubenswrapper[4781]: I1208 20:23:05.000334 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b9f5686-c996-46f5-962a-09086bd0d771-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:05 crc kubenswrapper[4781]: I1208 20:23:05.429998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kf4ss-config-g7j4w" event={"ID":"8b9f5686-c996-46f5-962a-09086bd0d771","Type":"ContainerDied","Data":"67885cb9cc4b27677657cec7c7a58f5c171d4a135038fb3f5a3fd38b4bd9b77c"} Dec 08 20:23:05 crc kubenswrapper[4781]: I1208 20:23:05.430052 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67885cb9cc4b27677657cec7c7a58f5c171d4a135038fb3f5a3fd38b4bd9b77c" Dec 08 20:23:05 crc kubenswrapper[4781]: I1208 20:23:05.430022 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kf4ss-config-g7j4w" Dec 08 20:23:05 crc kubenswrapper[4781]: I1208 20:23:05.844764 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kf4ss-config-g7j4w"] Dec 08 20:23:05 crc kubenswrapper[4781]: I1208 20:23:05.857735 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kf4ss-config-g7j4w"] Dec 08 20:23:06 crc kubenswrapper[4781]: I1208 20:23:06.140599 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9f5686-c996-46f5-962a-09086bd0d771" path="/var/lib/kubelet/pods/8b9f5686-c996-46f5-962a-09086bd0d771/volumes" Dec 08 20:23:10 crc kubenswrapper[4781]: I1208 20:23:10.475735 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b9gc6" event={"ID":"fa6f1315-4ed1-4ae4-988a-81375adf148b","Type":"ContainerStarted","Data":"699d994318ce90c6ba5dd35df8ce7edef7238aad2c15d3a77cd4c0f41e6e6e72"} Dec 08 20:23:10 crc kubenswrapper[4781]: I1208 20:23:10.492386 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-b9gc6" podStartSLOduration=2.6624343550000003 podStartE2EDuration="30.492371578s" podCreationTimestamp="2025-12-08 20:22:40 +0000 UTC" firstStartedPulling="2025-12-08 20:22:41.117221968 +0000 UTC m=+1077.268505345" lastFinishedPulling="2025-12-08 20:23:08.947159191 +0000 UTC m=+1105.098442568" observedRunningTime="2025-12-08 20:23:10.491712999 +0000 UTC m=+1106.642996376" watchObservedRunningTime="2025-12-08 20:23:10.492371578 +0000 UTC m=+1106.643654955" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.090264 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.148160 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-pfdb5"] Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 
20:23:11.148467 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" podUID="269342fb-c8ab-4f10-8b53-e49c89d73afa" containerName="dnsmasq-dns" containerID="cri-o://a0db2c0721bf37b3c44650542a81d1fa594ce2e235bc2ed1eed7d31917f5b6a2" gracePeriod=10 Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.487458 4781 generic.go:334] "Generic (PLEG): container finished" podID="269342fb-c8ab-4f10-8b53-e49c89d73afa" containerID="a0db2c0721bf37b3c44650542a81d1fa594ce2e235bc2ed1eed7d31917f5b6a2" exitCode=0 Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.487538 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" event={"ID":"269342fb-c8ab-4f10-8b53-e49c89d73afa","Type":"ContainerDied","Data":"a0db2c0721bf37b3c44650542a81d1fa594ce2e235bc2ed1eed7d31917f5b6a2"} Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.570863 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.721868 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-config\") pod \"269342fb-c8ab-4f10-8b53-e49c89d73afa\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.722008 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-sb\") pod \"269342fb-c8ab-4f10-8b53-e49c89d73afa\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.722040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg7j8\" (UniqueName: 
\"kubernetes.io/projected/269342fb-c8ab-4f10-8b53-e49c89d73afa-kube-api-access-bg7j8\") pod \"269342fb-c8ab-4f10-8b53-e49c89d73afa\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.722664 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-dns-svc\") pod \"269342fb-c8ab-4f10-8b53-e49c89d73afa\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.722817 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-nb\") pod \"269342fb-c8ab-4f10-8b53-e49c89d73afa\" (UID: \"269342fb-c8ab-4f10-8b53-e49c89d73afa\") " Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.727053 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/269342fb-c8ab-4f10-8b53-e49c89d73afa-kube-api-access-bg7j8" (OuterVolumeSpecName: "kube-api-access-bg7j8") pod "269342fb-c8ab-4f10-8b53-e49c89d73afa" (UID: "269342fb-c8ab-4f10-8b53-e49c89d73afa"). InnerVolumeSpecName "kube-api-access-bg7j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.765112 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-config" (OuterVolumeSpecName: "config") pod "269342fb-c8ab-4f10-8b53-e49c89d73afa" (UID: "269342fb-c8ab-4f10-8b53-e49c89d73afa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.765572 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "269342fb-c8ab-4f10-8b53-e49c89d73afa" (UID: "269342fb-c8ab-4f10-8b53-e49c89d73afa"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.766066 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "269342fb-c8ab-4f10-8b53-e49c89d73afa" (UID: "269342fb-c8ab-4f10-8b53-e49c89d73afa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.768204 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "269342fb-c8ab-4f10-8b53-e49c89d73afa" (UID: "269342fb-c8ab-4f10-8b53-e49c89d73afa"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.824874 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.824907 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg7j8\" (UniqueName: \"kubernetes.io/projected/269342fb-c8ab-4f10-8b53-e49c89d73afa-kube-api-access-bg7j8\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.824935 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.824943 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:11 crc kubenswrapper[4781]: I1208 20:23:11.824951 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/269342fb-c8ab-4f10-8b53-e49c89d73afa-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:12 crc kubenswrapper[4781]: I1208 20:23:12.497247 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" event={"ID":"269342fb-c8ab-4f10-8b53-e49c89d73afa","Type":"ContainerDied","Data":"bd7a4d6836dbdadb83e3d9b60997ec6ff98877aae1177c7068103735c6b2e9e6"} Dec 08 20:23:12 crc kubenswrapper[4781]: I1208 20:23:12.497302 4781 scope.go:117] "RemoveContainer" containerID="a0db2c0721bf37b3c44650542a81d1fa594ce2e235bc2ed1eed7d31917f5b6a2" Dec 08 20:23:12 crc kubenswrapper[4781]: I1208 20:23:12.497341 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-pfdb5" Dec 08 20:23:12 crc kubenswrapper[4781]: I1208 20:23:12.526350 4781 scope.go:117] "RemoveContainer" containerID="ba349445ac6dbf4e0bc38cdfc7d92a0f902501ad627806cb431e9cc014144933" Dec 08 20:23:12 crc kubenswrapper[4781]: I1208 20:23:12.527741 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-pfdb5"] Dec 08 20:23:12 crc kubenswrapper[4781]: I1208 20:23:12.536768 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-pfdb5"] Dec 08 20:23:14 crc kubenswrapper[4781]: I1208 20:23:14.142208 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="269342fb-c8ab-4f10-8b53-e49c89d73afa" path="/var/lib/kubelet/pods/269342fb-c8ab-4f10-8b53-e49c89d73afa/volumes" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.082132 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.448866 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-wbjm9"] Dec 08 20:23:16 crc kubenswrapper[4781]: E1208 20:23:16.449504 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269342fb-c8ab-4f10-8b53-e49c89d73afa" containerName="dnsmasq-dns" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.449525 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="269342fb-c8ab-4f10-8b53-e49c89d73afa" containerName="dnsmasq-dns" Dec 08 20:23:16 crc kubenswrapper[4781]: E1208 20:23:16.449537 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="269342fb-c8ab-4f10-8b53-e49c89d73afa" containerName="init" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.449544 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="269342fb-c8ab-4f10-8b53-e49c89d73afa" containerName="init" Dec 08 20:23:16 crc kubenswrapper[4781]: E1208 20:23:16.449556 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f5686-c996-46f5-962a-09086bd0d771" containerName="ovn-config" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.449563 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f5686-c996-46f5-962a-09086bd0d771" containerName="ovn-config" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.449723 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="269342fb-c8ab-4f10-8b53-e49c89d73afa" containerName="dnsmasq-dns" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.449750 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f5686-c996-46f5-962a-09086bd0d771" containerName="ovn-config" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.450357 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.457730 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wbjm9"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.500182 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b52eea-5a05-48e1-ad62-712af19a2e8c-operator-scripts\") pod \"cinder-db-create-wbjm9\" (UID: \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\") " pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.500282 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdnn\" (UniqueName: \"kubernetes.io/projected/a8b52eea-5a05-48e1-ad62-712af19a2e8c-kube-api-access-dkdnn\") pod \"cinder-db-create-wbjm9\" (UID: \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\") " pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.515449 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-6b1d-account-create-update-smzwf"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.517392 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.520489 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.524216 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6b1d-account-create-update-smzwf"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.536975 4781 generic.go:334] "Generic (PLEG): container finished" podID="fa6f1315-4ed1-4ae4-988a-81375adf148b" containerID="699d994318ce90c6ba5dd35df8ce7edef7238aad2c15d3a77cd4c0f41e6e6e72" exitCode=0 Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.537024 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b9gc6" event={"ID":"fa6f1315-4ed1-4ae4-988a-81375adf148b","Type":"ContainerDied","Data":"699d994318ce90c6ba5dd35df8ce7edef7238aad2c15d3a77cd4c0f41e6e6e72"} Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.572033 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-k6p2r"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.573131 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.575549 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0c06-account-create-update-t4vwj"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.576606 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.578972 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.606323 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k89rv\" (UniqueName: \"kubernetes.io/projected/89107c30-6f6d-45ff-a4f2-3a956e78c16c-kube-api-access-k89rv\") pod \"cinder-6b1d-account-create-update-smzwf\" (UID: \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\") " pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.606387 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdnn\" (UniqueName: \"kubernetes.io/projected/a8b52eea-5a05-48e1-ad62-712af19a2e8c-kube-api-access-dkdnn\") pod \"cinder-db-create-wbjm9\" (UID: \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\") " pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.606660 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89107c30-6f6d-45ff-a4f2-3a956e78c16c-operator-scripts\") pod \"cinder-6b1d-account-create-update-smzwf\" (UID: \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\") " pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.606770 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b52eea-5a05-48e1-ad62-712af19a2e8c-operator-scripts\") pod \"cinder-db-create-wbjm9\" (UID: \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\") " pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.607047 4781 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-db-create-k6p2r"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.607737 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b52eea-5a05-48e1-ad62-712af19a2e8c-operator-scripts\") pod \"cinder-db-create-wbjm9\" (UID: \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\") " pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.619148 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0c06-account-create-update-t4vwj"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.651585 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdnn\" (UniqueName: \"kubernetes.io/projected/a8b52eea-5a05-48e1-ad62-712af19a2e8c-kube-api-access-dkdnn\") pod \"cinder-db-create-wbjm9\" (UID: \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\") " pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.661510 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.707846 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dxqk\" (UniqueName: \"kubernetes.io/projected/104209dd-ab9e-4353-8a27-91f71e6ce510-kube-api-access-7dxqk\") pod \"barbican-0c06-account-create-update-t4vwj\" (UID: \"104209dd-ab9e-4353-8a27-91f71e6ce510\") " pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.708318 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k89rv\" (UniqueName: \"kubernetes.io/projected/89107c30-6f6d-45ff-a4f2-3a956e78c16c-kube-api-access-k89rv\") pod \"cinder-6b1d-account-create-update-smzwf\" (UID: \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\") " 
pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.708410 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63df1682-24c8-47c7-9140-8ec51934bd3c-operator-scripts\") pod \"barbican-db-create-k6p2r\" (UID: \"63df1682-24c8-47c7-9140-8ec51934bd3c\") " pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.708448 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvh9v\" (UniqueName: \"kubernetes.io/projected/63df1682-24c8-47c7-9140-8ec51934bd3c-kube-api-access-rvh9v\") pod \"barbican-db-create-k6p2r\" (UID: \"63df1682-24c8-47c7-9140-8ec51934bd3c\") " pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.708496 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89107c30-6f6d-45ff-a4f2-3a956e78c16c-operator-scripts\") pod \"cinder-6b1d-account-create-update-smzwf\" (UID: \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\") " pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.708547 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/104209dd-ab9e-4353-8a27-91f71e6ce510-operator-scripts\") pod \"barbican-0c06-account-create-update-t4vwj\" (UID: \"104209dd-ab9e-4353-8a27-91f71e6ce510\") " pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.709112 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89107c30-6f6d-45ff-a4f2-3a956e78c16c-operator-scripts\") pod 
\"cinder-6b1d-account-create-update-smzwf\" (UID: \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\") " pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.743641 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k89rv\" (UniqueName: \"kubernetes.io/projected/89107c30-6f6d-45ff-a4f2-3a956e78c16c-kube-api-access-k89rv\") pod \"cinder-6b1d-account-create-update-smzwf\" (UID: \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\") " pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.779085 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c18b-account-create-update-vmjzw"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.780430 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.783310 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.785932 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.791782 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tbszx"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.793157 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.804501 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c18b-account-create-update-vmjzw"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.809962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/104209dd-ab9e-4353-8a27-91f71e6ce510-operator-scripts\") pod \"barbican-0c06-account-create-update-t4vwj\" (UID: \"104209dd-ab9e-4353-8a27-91f71e6ce510\") " pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.810070 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dxqk\" (UniqueName: \"kubernetes.io/projected/104209dd-ab9e-4353-8a27-91f71e6ce510-kube-api-access-7dxqk\") pod \"barbican-0c06-account-create-update-t4vwj\" (UID: \"104209dd-ab9e-4353-8a27-91f71e6ce510\") " pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.810243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63df1682-24c8-47c7-9140-8ec51934bd3c-operator-scripts\") pod \"barbican-db-create-k6p2r\" (UID: \"63df1682-24c8-47c7-9140-8ec51934bd3c\") " pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.810384 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvh9v\" (UniqueName: \"kubernetes.io/projected/63df1682-24c8-47c7-9140-8ec51934bd3c-kube-api-access-rvh9v\") pod \"barbican-db-create-k6p2r\" (UID: \"63df1682-24c8-47c7-9140-8ec51934bd3c\") " pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.812159 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63df1682-24c8-47c7-9140-8ec51934bd3c-operator-scripts\") pod \"barbican-db-create-k6p2r\" (UID: \"63df1682-24c8-47c7-9140-8ec51934bd3c\") " pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.814218 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/104209dd-ab9e-4353-8a27-91f71e6ce510-operator-scripts\") pod \"barbican-0c06-account-create-update-t4vwj\" (UID: \"104209dd-ab9e-4353-8a27-91f71e6ce510\") " pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.839015 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.840673 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvh9v\" (UniqueName: \"kubernetes.io/projected/63df1682-24c8-47c7-9140-8ec51934bd3c-kube-api-access-rvh9v\") pod \"barbican-db-create-k6p2r\" (UID: \"63df1682-24c8-47c7-9140-8ec51934bd3c\") " pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.841610 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dxqk\" (UniqueName: \"kubernetes.io/projected/104209dd-ab9e-4353-8a27-91f71e6ce510-kube-api-access-7dxqk\") pod \"barbican-0c06-account-create-update-t4vwj\" (UID: \"104209dd-ab9e-4353-8a27-91f71e6ce510\") " pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.851777 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tbszx"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.871578 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9r6k2"] Dec 08 20:23:16 crc 
kubenswrapper[4781]: I1208 20:23:16.873057 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.877165 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4bstc" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.877368 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.877504 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.879087 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.881953 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9r6k2"] Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.895537 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.912161 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-operator-scripts\") pod \"neutron-db-create-tbszx\" (UID: \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\") " pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.912326 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qncw\" (UniqueName: \"kubernetes.io/projected/2759aaa8-f901-4a75-9341-5defd0024b8e-kube-api-access-9qncw\") pod \"keystone-db-sync-9r6k2\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.912471 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdc65348-cbcf-4d61-86b1-3ce8584964f3-operator-scripts\") pod \"neutron-c18b-account-create-update-vmjzw\" (UID: \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\") " pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.912572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-combined-ca-bundle\") pod \"keystone-db-sync-9r6k2\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.912609 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czmdd\" (UniqueName: 
\"kubernetes.io/projected/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-kube-api-access-czmdd\") pod \"neutron-db-create-tbszx\" (UID: \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\") " pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.912633 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdr48\" (UniqueName: \"kubernetes.io/projected/fdc65348-cbcf-4d61-86b1-3ce8584964f3-kube-api-access-qdr48\") pod \"neutron-c18b-account-create-update-vmjzw\" (UID: \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\") " pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.912680 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-config-data\") pod \"keystone-db-sync-9r6k2\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:16 crc kubenswrapper[4781]: I1208 20:23:16.914310 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.013628 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdc65348-cbcf-4d61-86b1-3ce8584964f3-operator-scripts\") pod \"neutron-c18b-account-create-update-vmjzw\" (UID: \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\") " pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.013701 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-combined-ca-bundle\") pod \"keystone-db-sync-9r6k2\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.013729 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czmdd\" (UniqueName: \"kubernetes.io/projected/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-kube-api-access-czmdd\") pod \"neutron-db-create-tbszx\" (UID: \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\") " pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.013747 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdr48\" (UniqueName: \"kubernetes.io/projected/fdc65348-cbcf-4d61-86b1-3ce8584964f3-kube-api-access-qdr48\") pod \"neutron-c18b-account-create-update-vmjzw\" (UID: \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\") " pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.013775 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-config-data\") pod \"keystone-db-sync-9r6k2\" (UID: 
\"2759aaa8-f901-4a75-9341-5defd0024b8e\") " pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.013821 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-operator-scripts\") pod \"neutron-db-create-tbszx\" (UID: \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\") " pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.013858 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qncw\" (UniqueName: \"kubernetes.io/projected/2759aaa8-f901-4a75-9341-5defd0024b8e-kube-api-access-9qncw\") pod \"keystone-db-sync-9r6k2\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.015614 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-operator-scripts\") pod \"neutron-db-create-tbszx\" (UID: \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\") " pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.018480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdc65348-cbcf-4d61-86b1-3ce8584964f3-operator-scripts\") pod \"neutron-c18b-account-create-update-vmjzw\" (UID: \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\") " pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.019749 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-config-data\") pod \"keystone-db-sync-9r6k2\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " 
pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.027833 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-combined-ca-bundle\") pod \"keystone-db-sync-9r6k2\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.035352 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czmdd\" (UniqueName: \"kubernetes.io/projected/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-kube-api-access-czmdd\") pod \"neutron-db-create-tbszx\" (UID: \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\") " pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.040521 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdr48\" (UniqueName: \"kubernetes.io/projected/fdc65348-cbcf-4d61-86b1-3ce8584964f3-kube-api-access-qdr48\") pod \"neutron-c18b-account-create-update-vmjzw\" (UID: \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\") " pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.047385 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qncw\" (UniqueName: \"kubernetes.io/projected/2759aaa8-f901-4a75-9341-5defd0024b8e-kube-api-access-9qncw\") pod \"keystone-db-sync-9r6k2\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.110602 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.211586 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.218874 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.585566 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-6b1d-account-create-update-smzwf"] Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.599406 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wbjm9"] Dec 08 20:23:17 crc kubenswrapper[4781]: W1208 20:23:17.601842 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89107c30_6f6d_45ff_a4f2_3a956e78c16c.slice/crio-9fa4e89574efaec90a95e115bbb62683c7558b9800f5af1e8879e3a1fe25bf22 WatchSource:0}: Error finding container 9fa4e89574efaec90a95e115bbb62683c7558b9800f5af1e8879e3a1fe25bf22: Status 404 returned error can't find the container with id 9fa4e89574efaec90a95e115bbb62683c7558b9800f5af1e8879e3a1fe25bf22 Dec 08 20:23:17 crc kubenswrapper[4781]: I1208 20:23:17.850403 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-k6p2r"] Dec 08 20:23:17 crc kubenswrapper[4781]: W1208 20:23:17.862215 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63df1682_24c8_47c7_9140_8ec51934bd3c.slice/crio-c02cbdc0b5d7a63997c153da93d91318a333fb8bd746c6fcf47cf189b9590a87 WatchSource:0}: Error finding container c02cbdc0b5d7a63997c153da93d91318a333fb8bd746c6fcf47cf189b9590a87: Status 404 returned error can't find the container with id c02cbdc0b5d7a63997c153da93d91318a333fb8bd746c6fcf47cf189b9590a87 Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.004382 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0c06-account-create-update-t4vwj"] Dec 
08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.068291 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c18b-account-create-update-vmjzw"] Dec 08 20:23:18 crc kubenswrapper[4781]: W1208 20:23:18.083131 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod104209dd_ab9e_4353_8a27_91f71e6ce510.slice/crio-009e3316594853deed9d4158f2c91e59e6ff200b568c1433c26bcfbbf5f03054 WatchSource:0}: Error finding container 009e3316594853deed9d4158f2c91e59e6ff200b568c1433c26bcfbbf5f03054: Status 404 returned error can't find the container with id 009e3316594853deed9d4158f2c91e59e6ff200b568c1433c26bcfbbf5f03054 Dec 08 20:23:18 crc kubenswrapper[4781]: W1208 20:23:18.087174 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdc65348_cbcf_4d61_86b1_3ce8584964f3.slice/crio-ece5b9f814a9309c7913c4d9ebf9928bbac14da9a356e8e214aae58a691823bf WatchSource:0}: Error finding container ece5b9f814a9309c7913c4d9ebf9928bbac14da9a356e8e214aae58a691823bf: Status 404 returned error can't find the container with id ece5b9f814a9309c7913c4d9ebf9928bbac14da9a356e8e214aae58a691823bf Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.117290 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tbszx"] Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.194848 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9r6k2"] Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.240700 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-b9gc6" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.375143 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-db-sync-config-data\") pod \"fa6f1315-4ed1-4ae4-988a-81375adf148b\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.375433 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lcns\" (UniqueName: \"kubernetes.io/projected/fa6f1315-4ed1-4ae4-988a-81375adf148b-kube-api-access-8lcns\") pod \"fa6f1315-4ed1-4ae4-988a-81375adf148b\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.375614 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-combined-ca-bundle\") pod \"fa6f1315-4ed1-4ae4-988a-81375adf148b\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.375767 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-config-data\") pod \"fa6f1315-4ed1-4ae4-988a-81375adf148b\" (UID: \"fa6f1315-4ed1-4ae4-988a-81375adf148b\") " Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.387164 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "fa6f1315-4ed1-4ae4-988a-81375adf148b" (UID: "fa6f1315-4ed1-4ae4-988a-81375adf148b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.391467 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6f1315-4ed1-4ae4-988a-81375adf148b-kube-api-access-8lcns" (OuterVolumeSpecName: "kube-api-access-8lcns") pod "fa6f1315-4ed1-4ae4-988a-81375adf148b" (UID: "fa6f1315-4ed1-4ae4-988a-81375adf148b"). InnerVolumeSpecName "kube-api-access-8lcns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.442077 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa6f1315-4ed1-4ae4-988a-81375adf148b" (UID: "fa6f1315-4ed1-4ae4-988a-81375adf148b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.478088 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.478127 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.478140 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lcns\" (UniqueName: \"kubernetes.io/projected/fa6f1315-4ed1-4ae4-988a-81375adf148b-kube-api-access-8lcns\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.478600 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-config-data" (OuterVolumeSpecName: "config-data") pod "fa6f1315-4ed1-4ae4-988a-81375adf148b" (UID: "fa6f1315-4ed1-4ae4-988a-81375adf148b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.571970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-b9gc6" event={"ID":"fa6f1315-4ed1-4ae4-988a-81375adf148b","Type":"ContainerDied","Data":"b997a9cac196cb6e18cf6b915eda87a971c935a7b6f65be53952a19a0970084a"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.572026 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b997a9cac196cb6e18cf6b915eda87a971c935a7b6f65be53952a19a0970084a" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.572123 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-b9gc6" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.579972 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa6f1315-4ed1-4ae4-988a-81375adf148b-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.581768 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-k6p2r" event={"ID":"63df1682-24c8-47c7-9140-8ec51934bd3c","Type":"ContainerStarted","Data":"373afcb128a58ba2a88fcd5198d686d3ef5595ac3346557bfb459cb6960fdfaf"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.581810 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-k6p2r" event={"ID":"63df1682-24c8-47c7-9140-8ec51934bd3c","Type":"ContainerStarted","Data":"c02cbdc0b5d7a63997c153da93d91318a333fb8bd746c6fcf47cf189b9590a87"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.591611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-9r6k2" event={"ID":"2759aaa8-f901-4a75-9341-5defd0024b8e","Type":"ContainerStarted","Data":"1358ec58a2887d216d4b8dd6346f3f0e2349272f2f3722648b30ee760cd7d5db"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.598677 4781 generic.go:334] "Generic (PLEG): container finished" podID="a8b52eea-5a05-48e1-ad62-712af19a2e8c" containerID="ae80acc57ad5d9145b7e1568ba9e3bb6261bae1149b3c60277353d3b449226b2" exitCode=0 Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.598836 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wbjm9" event={"ID":"a8b52eea-5a05-48e1-ad62-712af19a2e8c","Type":"ContainerDied","Data":"ae80acc57ad5d9145b7e1568ba9e3bb6261bae1149b3c60277353d3b449226b2"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.598956 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wbjm9" event={"ID":"a8b52eea-5a05-48e1-ad62-712af19a2e8c","Type":"ContainerStarted","Data":"1002bdc54a4d0e32068a271a895e57e66087354bc2d16d6a8718d2c150fd7f47"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.603475 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tbszx" event={"ID":"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf","Type":"ContainerStarted","Data":"a567f0a5464df23d7a56b15eac3b74a0a323fd24e917b6ed656f73fc3eef8b6b"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.612164 4781 generic.go:334] "Generic (PLEG): container finished" podID="89107c30-6f6d-45ff-a4f2-3a956e78c16c" containerID="03525b46669a0990a51dae4456f5e479895c525c92f468109a0d2c80e1007ee6" exitCode=0 Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.612272 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6b1d-account-create-update-smzwf" event={"ID":"89107c30-6f6d-45ff-a4f2-3a956e78c16c","Type":"ContainerDied","Data":"03525b46669a0990a51dae4456f5e479895c525c92f468109a0d2c80e1007ee6"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 
20:23:18.612307 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6b1d-account-create-update-smzwf" event={"ID":"89107c30-6f6d-45ff-a4f2-3a956e78c16c","Type":"ContainerStarted","Data":"9fa4e89574efaec90a95e115bbb62683c7558b9800f5af1e8879e3a1fe25bf22"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.613292 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-k6p2r" podStartSLOduration=2.613279145 podStartE2EDuration="2.613279145s" podCreationTimestamp="2025-12-08 20:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:18.611799283 +0000 UTC m=+1114.763082660" watchObservedRunningTime="2025-12-08 20:23:18.613279145 +0000 UTC m=+1114.764562522" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.627776 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0c06-account-create-update-t4vwj" event={"ID":"104209dd-ab9e-4353-8a27-91f71e6ce510","Type":"ContainerStarted","Data":"644d14bbbacdfc85c71670721a4824ba17f64b2cae83c57042d18c703ea16cf5"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.627822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0c06-account-create-update-t4vwj" event={"ID":"104209dd-ab9e-4353-8a27-91f71e6ce510","Type":"ContainerStarted","Data":"009e3316594853deed9d4158f2c91e59e6ff200b568c1433c26bcfbbf5f03054"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.631101 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c18b-account-create-update-vmjzw" event={"ID":"fdc65348-cbcf-4d61-86b1-3ce8584964f3","Type":"ContainerStarted","Data":"ece5b9f814a9309c7913c4d9ebf9928bbac14da9a356e8e214aae58a691823bf"} Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.640988 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-db-create-tbszx" podStartSLOduration=2.640967071 podStartE2EDuration="2.640967071s" podCreationTimestamp="2025-12-08 20:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:18.626358731 +0000 UTC m=+1114.777642108" watchObservedRunningTime="2025-12-08 20:23:18.640967071 +0000 UTC m=+1114.792250448" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.720608 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0c06-account-create-update-t4vwj" podStartSLOduration=2.7205921269999997 podStartE2EDuration="2.720592127s" podCreationTimestamp="2025-12-08 20:23:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:18.707477161 +0000 UTC m=+1114.858760528" watchObservedRunningTime="2025-12-08 20:23:18.720592127 +0000 UTC m=+1114.871875504" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.944226 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d879466b9-tx84q"] Dec 08 20:23:18 crc kubenswrapper[4781]: E1208 20:23:18.944874 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6f1315-4ed1-4ae4-988a-81375adf148b" containerName="glance-db-sync" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.944894 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6f1315-4ed1-4ae4-988a-81375adf148b" containerName="glance-db-sync" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.945101 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6f1315-4ed1-4ae4-988a-81375adf148b" containerName="glance-db-sync" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.946174 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:18 crc kubenswrapper[4781]: I1208 20:23:18.965012 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d879466b9-tx84q"] Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.090584 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-swift-storage-0\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.090654 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-config\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.090674 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-sb\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.090691 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbtjj\" (UniqueName: \"kubernetes.io/projected/1b14a49e-1bf0-4914-9625-14c58c351d6a-kube-api-access-xbtjj\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.090751 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-nb\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.090893 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-svc\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.193530 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-nb\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.193645 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-svc\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.193696 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-swift-storage-0\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.193761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-config\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.193787 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-sb\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.193818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbtjj\" (UniqueName: \"kubernetes.io/projected/1b14a49e-1bf0-4914-9625-14c58c351d6a-kube-api-access-xbtjj\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.194440 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-nb\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.194485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-svc\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.195074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-config\") 
pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.195269 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-sb\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.195880 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-swift-storage-0\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.237872 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbtjj\" (UniqueName: \"kubernetes.io/projected/1b14a49e-1bf0-4914-9625-14c58c351d6a-kube-api-access-xbtjj\") pod \"dnsmasq-dns-d879466b9-tx84q\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.271601 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.640973 4781 generic.go:334] "Generic (PLEG): container finished" podID="63df1682-24c8-47c7-9140-8ec51934bd3c" containerID="373afcb128a58ba2a88fcd5198d686d3ef5595ac3346557bfb459cb6960fdfaf" exitCode=0 Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.641114 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-k6p2r" event={"ID":"63df1682-24c8-47c7-9140-8ec51934bd3c","Type":"ContainerDied","Data":"373afcb128a58ba2a88fcd5198d686d3ef5595ac3346557bfb459cb6960fdfaf"} Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.643110 4781 generic.go:334] "Generic (PLEG): container finished" podID="ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf" containerID="14dd33fc2715cde77428d8609010f9fc2ac3ecdf41ebd2365f9ffa29ad9d235f" exitCode=0 Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.643298 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tbszx" event={"ID":"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf","Type":"ContainerDied","Data":"14dd33fc2715cde77428d8609010f9fc2ac3ecdf41ebd2365f9ffa29ad9d235f"} Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.645905 4781 generic.go:334] "Generic (PLEG): container finished" podID="104209dd-ab9e-4353-8a27-91f71e6ce510" containerID="644d14bbbacdfc85c71670721a4824ba17f64b2cae83c57042d18c703ea16cf5" exitCode=0 Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.645989 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0c06-account-create-update-t4vwj" event={"ID":"104209dd-ab9e-4353-8a27-91f71e6ce510","Type":"ContainerDied","Data":"644d14bbbacdfc85c71670721a4824ba17f64b2cae83c57042d18c703ea16cf5"} Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.647347 4781 generic.go:334] "Generic (PLEG): container finished" podID="fdc65348-cbcf-4d61-86b1-3ce8584964f3" 
containerID="fb0b2141e92491eafdc77c282dba8daee8151cf52a1e3db3ca7895de26ab804a" exitCode=0 Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.647580 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c18b-account-create-update-vmjzw" event={"ID":"fdc65348-cbcf-4d61-86b1-3ce8584964f3","Type":"ContainerDied","Data":"fb0b2141e92491eafdc77c282dba8daee8151cf52a1e3db3ca7895de26ab804a"} Dec 08 20:23:19 crc kubenswrapper[4781]: I1208 20:23:19.776221 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d879466b9-tx84q"] Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.064770 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.073038 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.237380 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b52eea-5a05-48e1-ad62-712af19a2e8c-operator-scripts\") pod \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\" (UID: \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\") " Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.237769 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkdnn\" (UniqueName: \"kubernetes.io/projected/a8b52eea-5a05-48e1-ad62-712af19a2e8c-kube-api-access-dkdnn\") pod \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\" (UID: \"a8b52eea-5a05-48e1-ad62-712af19a2e8c\") " Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.237813 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k89rv\" (UniqueName: \"kubernetes.io/projected/89107c30-6f6d-45ff-a4f2-3a956e78c16c-kube-api-access-k89rv\") pod 
\"89107c30-6f6d-45ff-a4f2-3a956e78c16c\" (UID: \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\") " Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.237894 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8b52eea-5a05-48e1-ad62-712af19a2e8c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8b52eea-5a05-48e1-ad62-712af19a2e8c" (UID: "a8b52eea-5a05-48e1-ad62-712af19a2e8c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.237988 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89107c30-6f6d-45ff-a4f2-3a956e78c16c-operator-scripts\") pod \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\" (UID: \"89107c30-6f6d-45ff-a4f2-3a956e78c16c\") " Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.238489 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8b52eea-5a05-48e1-ad62-712af19a2e8c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.238774 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89107c30-6f6d-45ff-a4f2-3a956e78c16c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89107c30-6f6d-45ff-a4f2-3a956e78c16c" (UID: "89107c30-6f6d-45ff-a4f2-3a956e78c16c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.242539 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8b52eea-5a05-48e1-ad62-712af19a2e8c-kube-api-access-dkdnn" (OuterVolumeSpecName: "kube-api-access-dkdnn") pod "a8b52eea-5a05-48e1-ad62-712af19a2e8c" (UID: "a8b52eea-5a05-48e1-ad62-712af19a2e8c"). 
InnerVolumeSpecName "kube-api-access-dkdnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.242972 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89107c30-6f6d-45ff-a4f2-3a956e78c16c-kube-api-access-k89rv" (OuterVolumeSpecName: "kube-api-access-k89rv") pod "89107c30-6f6d-45ff-a4f2-3a956e78c16c" (UID: "89107c30-6f6d-45ff-a4f2-3a956e78c16c"). InnerVolumeSpecName "kube-api-access-k89rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.339734 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89107c30-6f6d-45ff-a4f2-3a956e78c16c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.339768 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkdnn\" (UniqueName: \"kubernetes.io/projected/a8b52eea-5a05-48e1-ad62-712af19a2e8c-kube-api-access-dkdnn\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.339780 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k89rv\" (UniqueName: \"kubernetes.io/projected/89107c30-6f6d-45ff-a4f2-3a956e78c16c-kube-api-access-k89rv\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.658379 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wbjm9" event={"ID":"a8b52eea-5a05-48e1-ad62-712af19a2e8c","Type":"ContainerDied","Data":"1002bdc54a4d0e32068a271a895e57e66087354bc2d16d6a8718d2c150fd7f47"} Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.658430 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1002bdc54a4d0e32068a271a895e57e66087354bc2d16d6a8718d2c150fd7f47" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.658509 4781 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wbjm9" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.666571 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-6b1d-account-create-update-smzwf" event={"ID":"89107c30-6f6d-45ff-a4f2-3a956e78c16c","Type":"ContainerDied","Data":"9fa4e89574efaec90a95e115bbb62683c7558b9800f5af1e8879e3a1fe25bf22"} Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.666616 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fa4e89574efaec90a95e115bbb62683c7558b9800f5af1e8879e3a1fe25bf22" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.666656 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-6b1d-account-create-update-smzwf" Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.668427 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerID="f99257e52a618c9d6669a0a855d33f2421c2cb0209bb00953b89a3073be5f944" exitCode=0 Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.669125 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d879466b9-tx84q" event={"ID":"1b14a49e-1bf0-4914-9625-14c58c351d6a","Type":"ContainerDied","Data":"f99257e52a618c9d6669a0a855d33f2421c2cb0209bb00953b89a3073be5f944"} Dec 08 20:23:20 crc kubenswrapper[4781]: I1208 20:23:20.669181 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d879466b9-tx84q" event={"ID":"1b14a49e-1bf0-4914-9625-14c58c351d6a","Type":"ContainerStarted","Data":"2f2c4c073898332e80f6f0d1db091b2d6746e3a72dd86bd06af0f6798f26109f"} Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.363113 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.383569 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-operator-scripts\") pod \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\" (UID: \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\") " Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.383643 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czmdd\" (UniqueName: \"kubernetes.io/projected/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-kube-api-access-czmdd\") pod \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\" (UID: \"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf\") " Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.384801 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf" (UID: "ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.408162 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-kube-api-access-czmdd" (OuterVolumeSpecName: "kube-api-access-czmdd") pod "ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf" (UID: "ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf"). InnerVolumeSpecName "kube-api-access-czmdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.485433 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.485467 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czmdd\" (UniqueName: \"kubernetes.io/projected/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf-kube-api-access-czmdd\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.512696 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.517307 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.529227 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.586990 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63df1682-24c8-47c7-9140-8ec51934bd3c-operator-scripts\") pod \"63df1682-24c8-47c7-9140-8ec51934bd3c\" (UID: \"63df1682-24c8-47c7-9140-8ec51934bd3c\") " Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.587386 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dxqk\" (UniqueName: \"kubernetes.io/projected/104209dd-ab9e-4353-8a27-91f71e6ce510-kube-api-access-7dxqk\") pod \"104209dd-ab9e-4353-8a27-91f71e6ce510\" (UID: \"104209dd-ab9e-4353-8a27-91f71e6ce510\") " Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.587444 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvh9v\" (UniqueName: \"kubernetes.io/projected/63df1682-24c8-47c7-9140-8ec51934bd3c-kube-api-access-rvh9v\") pod \"63df1682-24c8-47c7-9140-8ec51934bd3c\" (UID: \"63df1682-24c8-47c7-9140-8ec51934bd3c\") " Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.587502 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/104209dd-ab9e-4353-8a27-91f71e6ce510-operator-scripts\") pod \"104209dd-ab9e-4353-8a27-91f71e6ce510\" (UID: \"104209dd-ab9e-4353-8a27-91f71e6ce510\") " Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.587613 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdc65348-cbcf-4d61-86b1-3ce8584964f3-operator-scripts\") pod \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\" (UID: \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\") " Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.587630 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63df1682-24c8-47c7-9140-8ec51934bd3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63df1682-24c8-47c7-9140-8ec51934bd3c" (UID: "63df1682-24c8-47c7-9140-8ec51934bd3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.587657 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdr48\" (UniqueName: \"kubernetes.io/projected/fdc65348-cbcf-4d61-86b1-3ce8584964f3-kube-api-access-qdr48\") pod \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\" (UID: \"fdc65348-cbcf-4d61-86b1-3ce8584964f3\") " Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.587870 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/104209dd-ab9e-4353-8a27-91f71e6ce510-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "104209dd-ab9e-4353-8a27-91f71e6ce510" (UID: "104209dd-ab9e-4353-8a27-91f71e6ce510"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.588058 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63df1682-24c8-47c7-9140-8ec51934bd3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.588076 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/104209dd-ab9e-4353-8a27-91f71e6ce510-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.588293 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdc65348-cbcf-4d61-86b1-3ce8584964f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fdc65348-cbcf-4d61-86b1-3ce8584964f3" (UID: "fdc65348-cbcf-4d61-86b1-3ce8584964f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.592012 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/104209dd-ab9e-4353-8a27-91f71e6ce510-kube-api-access-7dxqk" (OuterVolumeSpecName: "kube-api-access-7dxqk") pod "104209dd-ab9e-4353-8a27-91f71e6ce510" (UID: "104209dd-ab9e-4353-8a27-91f71e6ce510"). InnerVolumeSpecName "kube-api-access-7dxqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.593619 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdc65348-cbcf-4d61-86b1-3ce8584964f3-kube-api-access-qdr48" (OuterVolumeSpecName: "kube-api-access-qdr48") pod "fdc65348-cbcf-4d61-86b1-3ce8584964f3" (UID: "fdc65348-cbcf-4d61-86b1-3ce8584964f3"). InnerVolumeSpecName "kube-api-access-qdr48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.595600 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63df1682-24c8-47c7-9140-8ec51934bd3c-kube-api-access-rvh9v" (OuterVolumeSpecName: "kube-api-access-rvh9v") pod "63df1682-24c8-47c7-9140-8ec51934bd3c" (UID: "63df1682-24c8-47c7-9140-8ec51934bd3c"). InnerVolumeSpecName "kube-api-access-rvh9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.683164 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0c06-account-create-update-t4vwj" event={"ID":"104209dd-ab9e-4353-8a27-91f71e6ce510","Type":"ContainerDied","Data":"009e3316594853deed9d4158f2c91e59e6ff200b568c1433c26bcfbbf5f03054"} Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.683207 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009e3316594853deed9d4158f2c91e59e6ff200b568c1433c26bcfbbf5f03054" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.683289 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0c06-account-create-update-t4vwj" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.686482 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-c18b-account-create-update-vmjzw" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.686474 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c18b-account-create-update-vmjzw" event={"ID":"fdc65348-cbcf-4d61-86b1-3ce8584964f3","Type":"ContainerDied","Data":"ece5b9f814a9309c7913c4d9ebf9928bbac14da9a356e8e214aae58a691823bf"} Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.686541 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ece5b9f814a9309c7913c4d9ebf9928bbac14da9a356e8e214aae58a691823bf" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.688930 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dxqk\" (UniqueName: \"kubernetes.io/projected/104209dd-ab9e-4353-8a27-91f71e6ce510-kube-api-access-7dxqk\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.688950 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvh9v\" (UniqueName: \"kubernetes.io/projected/63df1682-24c8-47c7-9140-8ec51934bd3c-kube-api-access-rvh9v\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.688960 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdc65348-cbcf-4d61-86b1-3ce8584964f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.688969 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdr48\" (UniqueName: \"kubernetes.io/projected/fdc65348-cbcf-4d61-86b1-3ce8584964f3-kube-api-access-qdr48\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.695888 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-k6p2r" 
event={"ID":"63df1682-24c8-47c7-9140-8ec51934bd3c","Type":"ContainerDied","Data":"c02cbdc0b5d7a63997c153da93d91318a333fb8bd746c6fcf47cf189b9590a87"} Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.695901 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-k6p2r" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.695988 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c02cbdc0b5d7a63997c153da93d91318a333fb8bd746c6fcf47cf189b9590a87" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.700513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d879466b9-tx84q" event={"ID":"1b14a49e-1bf0-4914-9625-14c58c351d6a","Type":"ContainerStarted","Data":"5f7d25fcf036d3edffa6034a78fbd156419592ca8440b9ff8c70a8e1591f570e"} Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.701347 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.719543 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tbszx" event={"ID":"ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf","Type":"ContainerDied","Data":"a567f0a5464df23d7a56b15eac3b74a0a323fd24e917b6ed656f73fc3eef8b6b"} Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.719608 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a567f0a5464df23d7a56b15eac3b74a0a323fd24e917b6ed656f73fc3eef8b6b" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.719705 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tbszx" Dec 08 20:23:21 crc kubenswrapper[4781]: I1208 20:23:21.729054 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d879466b9-tx84q" podStartSLOduration=3.729035388 podStartE2EDuration="3.729035388s" podCreationTimestamp="2025-12-08 20:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:21.721958615 +0000 UTC m=+1117.873241992" watchObservedRunningTime="2025-12-08 20:23:21.729035388 +0000 UTC m=+1117.880318765" Dec 08 20:23:24 crc kubenswrapper[4781]: I1208 20:23:24.755576 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9r6k2" event={"ID":"2759aaa8-f901-4a75-9341-5defd0024b8e","Type":"ContainerStarted","Data":"05b3fdd09127636c6c1ef51e29d6eae73de8c801d3eff8f636bdf079c37fad52"} Dec 08 20:23:24 crc kubenswrapper[4781]: I1208 20:23:24.772389 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9r6k2" podStartSLOduration=2.51738965 podStartE2EDuration="8.772371161s" podCreationTimestamp="2025-12-08 20:23:16 +0000 UTC" firstStartedPulling="2025-12-08 20:23:18.211612269 +0000 UTC m=+1114.362895646" lastFinishedPulling="2025-12-08 20:23:24.46659378 +0000 UTC m=+1120.617877157" observedRunningTime="2025-12-08 20:23:24.771953199 +0000 UTC m=+1120.923236576" watchObservedRunningTime="2025-12-08 20:23:24.772371161 +0000 UTC m=+1120.923654538" Dec 08 20:23:27 crc kubenswrapper[4781]: I1208 20:23:27.783759 4781 generic.go:334] "Generic (PLEG): container finished" podID="2759aaa8-f901-4a75-9341-5defd0024b8e" containerID="05b3fdd09127636c6c1ef51e29d6eae73de8c801d3eff8f636bdf079c37fad52" exitCode=0 Dec 08 20:23:27 crc kubenswrapper[4781]: I1208 20:23:27.783802 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9r6k2" 
event={"ID":"2759aaa8-f901-4a75-9341-5defd0024b8e","Type":"ContainerDied","Data":"05b3fdd09127636c6c1ef51e29d6eae73de8c801d3eff8f636bdf079c37fad52"} Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.096409 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.255792 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-config-data\") pod \"2759aaa8-f901-4a75-9341-5defd0024b8e\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.256342 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-combined-ca-bundle\") pod \"2759aaa8-f901-4a75-9341-5defd0024b8e\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.256388 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qncw\" (UniqueName: \"kubernetes.io/projected/2759aaa8-f901-4a75-9341-5defd0024b8e-kube-api-access-9qncw\") pod \"2759aaa8-f901-4a75-9341-5defd0024b8e\" (UID: \"2759aaa8-f901-4a75-9341-5defd0024b8e\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.261071 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2759aaa8-f901-4a75-9341-5defd0024b8e-kube-api-access-9qncw" (OuterVolumeSpecName: "kube-api-access-9qncw") pod "2759aaa8-f901-4a75-9341-5defd0024b8e" (UID: "2759aaa8-f901-4a75-9341-5defd0024b8e"). InnerVolumeSpecName "kube-api-access-9qncw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.274140 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.279051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2759aaa8-f901-4a75-9341-5defd0024b8e" (UID: "2759aaa8-f901-4a75-9341-5defd0024b8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.297252 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-config-data" (OuterVolumeSpecName: "config-data") pod "2759aaa8-f901-4a75-9341-5defd0024b8e" (UID: "2759aaa8-f901-4a75-9341-5defd0024b8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.358016 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.358062 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2759aaa8-f901-4a75-9341-5defd0024b8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.358078 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qncw\" (UniqueName: \"kubernetes.io/projected/2759aaa8-f901-4a75-9341-5defd0024b8e-kube-api-access-9qncw\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.365545 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-thdb8"] Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.365813 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" podUID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" containerName="dnsmasq-dns" containerID="cri-o://abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870" gracePeriod=10 Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.698348 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.784566 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-svc\") pod \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.784633 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-config\") pod \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.784697 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-nb\") pod \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.784730 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-sb\") pod \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.784851 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-swift-storage-0\") pod \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.784934 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlmn9\" 
(UniqueName: \"kubernetes.io/projected/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-kube-api-access-vlmn9\") pod \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\" (UID: \"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0\") " Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.792155 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-kube-api-access-vlmn9" (OuterVolumeSpecName: "kube-api-access-vlmn9") pod "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" (UID: "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0"). InnerVolumeSpecName "kube-api-access-vlmn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.801032 4781 generic.go:334] "Generic (PLEG): container finished" podID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" containerID="abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870" exitCode=0 Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.801099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" event={"ID":"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0","Type":"ContainerDied","Data":"abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870"} Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.801128 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" event={"ID":"65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0","Type":"ContainerDied","Data":"111c34aed60f1c095052d202265e29ee0eaa2b18dc2eb31ddbf33b7e584e80b7"} Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.801147 4781 scope.go:117] "RemoveContainer" containerID="abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.801267 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-thdb8" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.806166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9r6k2" event={"ID":"2759aaa8-f901-4a75-9341-5defd0024b8e","Type":"ContainerDied","Data":"1358ec58a2887d216d4b8dd6346f3f0e2349272f2f3722648b30ee760cd7d5db"} Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.806223 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1358ec58a2887d216d4b8dd6346f3f0e2349272f2f3722648b30ee760cd7d5db" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.806313 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9r6k2" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.829791 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" (UID: "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.832261 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" (UID: "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.833536 4781 scope.go:117] "RemoveContainer" containerID="f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.838953 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" (UID: "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.848489 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-config" (OuterVolumeSpecName: "config") pod "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" (UID: "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.853146 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" (UID: "65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.858492 4781 scope.go:117] "RemoveContainer" containerID="abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870" Dec 08 20:23:29 crc kubenswrapper[4781]: E1208 20:23:29.859052 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870\": container with ID starting with abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870 not found: ID does not exist" containerID="abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.859095 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870"} err="failed to get container status \"abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870\": rpc error: code = NotFound desc = could not find container \"abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870\": container with ID starting with abc88af504abbc2dfab78c44b2fd97db55c6d667b635731f22ad2458fb465870 not found: ID does not exist" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.859118 4781 scope.go:117] "RemoveContainer" containerID="f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3" Dec 08 20:23:29 crc kubenswrapper[4781]: E1208 20:23:29.859542 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3\": container with ID starting with f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3 not found: ID does not exist" containerID="f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.859562 
4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3"} err="failed to get container status \"f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3\": rpc error: code = NotFound desc = could not find container \"f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3\": container with ID starting with f6bf3a23dbe8bc5651b9e469feb229bb8066433f75d90cd8420d4a51f1fc73e3 not found: ID does not exist" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.886382 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.886414 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.886425 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.886434 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.886446 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:29 crc kubenswrapper[4781]: I1208 20:23:29.886454 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlmn9\" 
(UniqueName: \"kubernetes.io/projected/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0-kube-api-access-vlmn9\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.080902 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5689975857-zcq4k"] Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081381 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2759aaa8-f901-4a75-9341-5defd0024b8e" containerName="keystone-db-sync" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081403 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2759aaa8-f901-4a75-9341-5defd0024b8e" containerName="keystone-db-sync" Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081416 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8b52eea-5a05-48e1-ad62-712af19a2e8c" containerName="mariadb-database-create" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081424 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b52eea-5a05-48e1-ad62-712af19a2e8c" containerName="mariadb-database-create" Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081439 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" containerName="init" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081447 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" containerName="init" Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081464 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63df1682-24c8-47c7-9140-8ec51934bd3c" containerName="mariadb-database-create" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081471 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="63df1682-24c8-47c7-9140-8ec51934bd3c" containerName="mariadb-database-create" Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081487 4781 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fdc65348-cbcf-4d61-86b1-3ce8584964f3" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081495 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdc65348-cbcf-4d61-86b1-3ce8584964f3" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081513 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89107c30-6f6d-45ff-a4f2-3a956e78c16c" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081521 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="89107c30-6f6d-45ff-a4f2-3a956e78c16c" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081537 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" containerName="dnsmasq-dns" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081544 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" containerName="dnsmasq-dns" Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081560 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="104209dd-ab9e-4353-8a27-91f71e6ce510" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081568 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="104209dd-ab9e-4353-8a27-91f71e6ce510" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.081581 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf" containerName="mariadb-database-create" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081588 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf" containerName="mariadb-database-create" Dec 08 
20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081764 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="89107c30-6f6d-45ff-a4f2-3a956e78c16c" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081777 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf" containerName="mariadb-database-create" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081793 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" containerName="dnsmasq-dns" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081809 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="63df1682-24c8-47c7-9140-8ec51934bd3c" containerName="mariadb-database-create" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081825 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdc65348-cbcf-4d61-86b1-3ce8584964f3" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081834 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="104209dd-ab9e-4353-8a27-91f71e6ce510" containerName="mariadb-account-create-update" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081845 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2759aaa8-f901-4a75-9341-5defd0024b8e" containerName="keystone-db-sync" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.081859 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8b52eea-5a05-48e1-ad62-712af19a2e8c" containerName="mariadb-database-create" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.082956 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.089466 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-svc\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.089546 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-nb\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.089592 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-sb\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.089646 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-swift-storage-0\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.089668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-config\") pod \"dnsmasq-dns-5689975857-zcq4k\" 
(UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.089683 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpnw4\" (UniqueName: \"kubernetes.io/projected/13fc3ed9-1b10-4793-bc58-ab555d22827c-kube-api-access-tpnw4\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.106546 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lnhsl"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.107593 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.112984 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.113204 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.113311 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.113613 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4bstc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.118608 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.149092 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5689975857-zcq4k"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.194990 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-nb\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.195084 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-sb\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.195139 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-swift-storage-0\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.195160 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-config\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.195179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpnw4\" (UniqueName: \"kubernetes.io/projected/13fc3ed9-1b10-4793-bc58-ab555d22827c-kube-api-access-tpnw4\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.195203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-svc\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.196015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-svc\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.197319 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-nb\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.197902 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-sb\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.201178 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-config\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.201877 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-swift-storage-0\") pod \"dnsmasq-dns-5689975857-zcq4k\" 
(UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.206650 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lnhsl"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.233953 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-thdb8"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.245844 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpnw4\" (UniqueName: \"kubernetes.io/projected/13fc3ed9-1b10-4793-bc58-ab555d22827c-kube-api-access-tpnw4\") pod \"dnsmasq-dns-5689975857-zcq4k\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.246628 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-thdb8"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.271903 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-747cc54cb5-cgsnc"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.273217 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.277071 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.277320 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-bkmvs" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.277627 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.277759 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.297617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-fernet-keys\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.297815 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-combined-ca-bundle\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.297862 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-config-data\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.297896 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-credential-keys\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.297982 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-scripts\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.298057 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtgdj\" (UniqueName: \"kubernetes.io/projected/9d1074ff-6031-42ce-85b7-acfed9dad04d-kube-api-access-xtgdj\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.299650 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-747cc54cb5-cgsnc"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401103 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-scripts\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40fdaf60-ff95-49e7-9c00-622d211a969b-logs\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " 
pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401233 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40fdaf60-ff95-49e7-9c00-622d211a969b-horizon-secret-key\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401273 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtgdj\" (UniqueName: \"kubernetes.io/projected/9d1074ff-6031-42ce-85b7-acfed9dad04d-kube-api-access-xtgdj\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401318 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-config-data\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-fernet-keys\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401434 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-scripts\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 
20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401465 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-combined-ca-bundle\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-config-data\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401512 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzrdj\" (UniqueName: \"kubernetes.io/projected/40fdaf60-ff95-49e7-9c00-622d211a969b-kube-api-access-qzrdj\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.401537 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-credential-keys\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.414403 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-5ls9p"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.419830 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.419902 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-config-data\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.421806 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cdfmn" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.423042 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.423378 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.425036 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-75vwt"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.426045 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.427877 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nmjwb" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.428910 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.430852 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-combined-ca-bundle\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.431167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-scripts\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.431506 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-credential-keys\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.437494 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.438068 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-75vwt"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.438378 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.464627 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtgdj\" (UniqueName: \"kubernetes.io/projected/9d1074ff-6031-42ce-85b7-acfed9dad04d-kube-api-access-xtgdj\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.477517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-fernet-keys\") pod \"keystone-bootstrap-lnhsl\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.484926 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5ls9p"] Dec 08 20:23:30 crc kubenswrapper[4781]: E1208 20:23:30.491317 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65e8d6e4_c7b8_4aca_8df5_ec3f9a655cb0.slice\": RecentStats: unable to find data in memory cache]" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.517444 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q8km\" (UniqueName: \"kubernetes.io/projected/499d0466-ecbe-4866-b516-3c778c16ec94-kube-api-access-5q8km\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.517504 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-config-data\") pod 
\"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520538 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-scripts\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520617 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-combined-ca-bundle\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-scripts\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520743 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-config\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520781 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzrdj\" (UniqueName: \"kubernetes.io/projected/40fdaf60-ff95-49e7-9c00-622d211a969b-kube-api-access-qzrdj\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " 
pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-combined-ca-bundle\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520842 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-config-data\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520905 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-db-sync-config-data\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520965 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl9tk\" (UniqueName: \"kubernetes.io/projected/ac6c7297-a2fe-4569-b0a2-a0a2329df115-kube-api-access-fl9tk\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.520987 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/499d0466-ecbe-4866-b516-3c778c16ec94-etc-machine-id\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " 
pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.521006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40fdaf60-ff95-49e7-9c00-622d211a969b-logs\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.521026 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40fdaf60-ff95-49e7-9c00-622d211a969b-horizon-secret-key\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.521566 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40fdaf60-ff95-49e7-9c00-622d211a969b-logs\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.522319 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-config-data\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.522686 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-scripts\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.525975 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40fdaf60-ff95-49e7-9c00-622d211a969b-horizon-secret-key\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.556387 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzrdj\" (UniqueName: \"kubernetes.io/projected/40fdaf60-ff95-49e7-9c00-622d211a969b-kube-api-access-qzrdj\") pod \"horizon-747cc54cb5-cgsnc\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.579975 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.582093 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.590401 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.590604 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.623527 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9wz9p"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.624961 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.628550 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4f784" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.628767 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633006 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-combined-ca-bundle\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633109 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-config-data\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633177 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-db-sync-config-data\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633231 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl9tk\" (UniqueName: \"kubernetes.io/projected/ac6c7297-a2fe-4569-b0a2-a0a2329df115-kube-api-access-fl9tk\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633251 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/499d0466-ecbe-4866-b516-3c778c16ec94-etc-machine-id\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q8km\" (UniqueName: \"kubernetes.io/projected/499d0466-ecbe-4866-b516-3c778c16ec94-kube-api-access-5q8km\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633356 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-scripts\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633425 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-combined-ca-bundle\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.633524 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-config\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.638530 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/499d0466-ecbe-4866-b516-3c778c16ec94-etc-machine-id\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.644654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-combined-ca-bundle\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.644969 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-config\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.645116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-db-sync-config-data\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.645632 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-combined-ca-bundle\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.645670 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.649947 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-config-data\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.651612 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.657616 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-scripts\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.677681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q8km\" (UniqueName: \"kubernetes.io/projected/499d0466-ecbe-4866-b516-3c778c16ec94-kube-api-access-5q8km\") pod \"cinder-db-sync-75vwt\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " pod="openstack/cinder-db-sync-75vwt" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.695088 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl9tk\" (UniqueName: \"kubernetes.io/projected/ac6c7297-a2fe-4569-b0a2-a0a2329df115-kube-api-access-fl9tk\") pod \"neutron-db-sync-5ls9p\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.697905 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9wz9p"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.728828 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-594cfd6f5c-pppkp"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.730388 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb9fz\" (UniqueName: \"kubernetes.io/projected/da20c49c-5d8d-4701-a54b-c6ae7b6670db-kube-api-access-gb9fz\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736468 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjgtj\" (UniqueName: \"kubernetes.io/projected/279c45d3-ece4-42b6-8968-90806c171bf9-kube-api-access-tjgtj\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736496 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/965ff155-f5c6-4c49-9d46-18a62ef94308-logs\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736521 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-scripts\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736537 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-config-data\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " 
pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736561 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flw9b\" (UniqueName: \"kubernetes.io/projected/965ff155-f5c6-4c49-9d46-18a62ef94308-kube-api-access-flw9b\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736589 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736611 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-config-data\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-run-httpd\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/965ff155-f5c6-4c49-9d46-18a62ef94308-horizon-secret-key\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc 
kubenswrapper[4781]: I1208 20:23:30.736701 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-combined-ca-bundle\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736726 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736746 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-scripts\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736762 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-db-sync-config-data\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.736785 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-log-httpd\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.752987 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.755742 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-594cfd6f5c-pppkp"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.796428 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.801497 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.804126 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4d2w8" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.804324 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.804648 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.804782 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.811781 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5689975857-zcq4k"] Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.840103 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flw9b\" (UniqueName: \"kubernetes.io/projected/965ff155-f5c6-4c49-9d46-18a62ef94308-kube-api-access-flw9b\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.843706 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 
20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.853943 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855242 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-config-data\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855350 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-run-httpd\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/965ff155-f5c6-4c49-9d46-18a62ef94308-horizon-secret-key\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855406 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-combined-ca-bundle\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855443 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855474 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-scripts\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-db-sync-config-data\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855559 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-log-httpd\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855691 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb9fz\" (UniqueName: \"kubernetes.io/projected/da20c49c-5d8d-4701-a54b-c6ae7b6670db-kube-api-access-gb9fz\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855731 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjgtj\" (UniqueName: \"kubernetes.io/projected/279c45d3-ece4-42b6-8968-90806c171bf9-kube-api-access-tjgtj\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " 
pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/965ff155-f5c6-4c49-9d46-18a62ef94308-logs\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855778 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-scripts\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.855799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-config-data\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.857140 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-config-data\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.857601 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/965ff155-f5c6-4c49-9d46-18a62ef94308-logs\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.861224 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-scripts\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.861679 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-log-httpd\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.862396 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flw9b\" (UniqueName: \"kubernetes.io/projected/965ff155-f5c6-4c49-9d46-18a62ef94308-kube-api-access-flw9b\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.864905 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-db-sync-config-data\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " pod="openstack/barbican-db-sync-9wz9p"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.865199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-run-httpd\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.866384 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/965ff155-f5c6-4c49-9d46-18a62ef94308-horizon-secret-key\") pod \"horizon-594cfd6f5c-pppkp\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " pod="openstack/horizon-594cfd6f5c-pppkp"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.868254 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-bnfg8"]
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.870882 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.874212 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-scripts\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.874836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.874899 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.875234 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.875392 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sthd5"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.879560 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjgtj\" (UniqueName: \"kubernetes.io/projected/279c45d3-ece4-42b6-8968-90806c171bf9-kube-api-access-tjgtj\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " pod="openstack/barbican-db-sync-9wz9p"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.875910 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-config-data\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.886829 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb9fz\" (UniqueName: \"kubernetes.io/projected/da20c49c-5d8d-4701-a54b-c6ae7b6670db-kube-api-access-gb9fz\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.887405 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-combined-ca-bundle\") pod \"barbican-db-sync-9wz9p\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " pod="openstack/barbican-db-sync-9wz9p"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.893156 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-lk5tr"]
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.894904 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.905108 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " pod="openstack/ceilometer-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.905189 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bnfg8"]
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.918671 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5ls9p"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.937389 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-lk5tr"]
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.945462 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-75vwt"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.958379 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-logs\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.958594 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.958635 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.958678 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.958707 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-scripts\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.958764 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq97l\" (UniqueName: \"kubernetes.io/projected/e29d9331-4757-45f1-b92d-affdf17f3a09-kube-api-access-gq97l\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.958807 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-config-data\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.958852 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.974108 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 20:23:30 crc kubenswrapper[4781]: I1208 20:23:30.996902 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9wz9p"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.060621 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-config-data\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.060680 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.060714 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.060771 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjpht\" (UniqueName: \"kubernetes.io/projected/c4245549-33f8-4a0e-a17e-08417bed869c-kube-api-access-zjpht\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.060810 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-scripts\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.060853 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-logs\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.060879 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-combined-ca-bundle\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.060971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlnch\" (UniqueName: \"kubernetes.io/projected/8bf90300-5bf4-4ec5-ba4b-6145663748fa-kube-api-access-tlnch\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061076 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061100 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-scripts\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061126 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-config\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061147 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061177 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq97l\" (UniqueName: \"kubernetes.io/projected/e29d9331-4757-45f1-b92d-affdf17f3a09-kube-api-access-gq97l\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061237 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061257 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4245549-33f8-4a0e-a17e-08417bed869c-logs\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-config-data\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061442 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061462 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-logs\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.061818 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.065736 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594cfd6f5c-pppkp"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.068598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.091131 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-config-data\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.091827 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-scripts\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.093495 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.103400 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq97l\" (UniqueName: \"kubernetes.io/projected/e29d9331-4757-45f1-b92d-affdf17f3a09-kube-api-access-gq97l\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.163702 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlnch\" (UniqueName: \"kubernetes.io/projected/8bf90300-5bf4-4ec5-ba4b-6145663748fa-kube-api-access-tlnch\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.163791 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-config\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.163816 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.163848 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.163886 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.163910 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4245549-33f8-4a0e-a17e-08417bed869c-logs\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.163967 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-config-data\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.163996 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.164035 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjpht\" (UniqueName: \"kubernetes.io/projected/c4245549-33f8-4a0e-a17e-08417bed869c-kube-api-access-zjpht\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.164068 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-scripts\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.164107 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-combined-ca-bundle\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.167627 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-combined-ca-bundle\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.167776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.168808 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-config\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.168900 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.177726 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.178363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4245549-33f8-4a0e-a17e-08417bed869c-logs\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.178373 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.183747 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " pod="openstack/glance-default-external-api-0"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.185452 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-config-data\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.199402 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlnch\" (UniqueName: \"kubernetes.io/projected/8bf90300-5bf4-4ec5-ba4b-6145663748fa-kube-api-access-tlnch\") pod \"dnsmasq-dns-74fd8b655f-lk5tr\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.216760 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-scripts\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.219199 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjpht\" (UniqueName: \"kubernetes.io/projected/c4245549-33f8-4a0e-a17e-08417bed869c-kube-api-access-zjpht\") pod \"placement-db-sync-bnfg8\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.230141 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-747cc54cb5-cgsnc"]
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.231667 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr"
Dec 08 20:23:31 crc kubenswrapper[4781]: I1208 20:23:31.239910 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5689975857-zcq4k"]
Dec 08 20:23:31 crc kubenswrapper[4781]: W1208 20:23:31.245943 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13fc3ed9_1b10_4793_bc58_ab555d22827c.slice/crio-9759b3d9f0215a54cf4cb636ac1a221b052e2a1e13358e9df297f08941284731 WatchSource:0}: Error finding container 9759b3d9f0215a54cf4cb636ac1a221b052e2a1e13358e9df297f08941284731: Status 404 returned error can't find the container with id 9759b3d9f0215a54cf4cb636ac1a221b052e2a1e13358e9df297f08941284731
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.357327 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.359443 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.366617 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.366721 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.377788 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.440139 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.470467 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.470816 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.470887 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.470934 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.470979 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl4zf\" (UniqueName: \"kubernetes.io/projected/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-kube-api-access-jl4zf\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.471018 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.471062 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.471085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.492201 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lnhsl"]
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.507449 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-bnfg8"
Dec 08 20:23:32 crc kubenswrapper[4781]: W1208 20:23:31.516509 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d1074ff_6031_42ce_85b7_acfed9dad04d.slice/crio-2485280c0290dff409b5573ed440e3d66e276a8ad6aae72a28d9678056baf473 WatchSource:0}: Error finding container 2485280c0290dff409b5573ed440e3d66e276a8ad6aae72a28d9678056baf473: Status 404 returned error can't find the container with id 2485280c0290dff409b5573ed440e3d66e276a8ad6aae72a28d9678056baf473
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.572489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.572540 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.572575 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.572611 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.572663 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.572690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.572726 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl4zf\" (UniqueName: \"kubernetes.io/projected/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-kube-api-access-jl4zf\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.572761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0"
Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.573438 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-httpd-run\") pod \"glance-default-internal-api-0\" (UID:
\"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.574186 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.580803 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-logs\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.586092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.604333 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.609097 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " 
pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.609849 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.612429 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl4zf\" (UniqueName: \"kubernetes.io/projected/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-kube-api-access-jl4zf\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.618580 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.845451 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.856554 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5689975857-zcq4k" event={"ID":"13fc3ed9-1b10-4793-bc58-ab555d22827c","Type":"ContainerStarted","Data":"9759b3d9f0215a54cf4cb636ac1a221b052e2a1e13358e9df297f08941284731"} Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.857832 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lnhsl" event={"ID":"9d1074ff-6031-42ce-85b7-acfed9dad04d","Type":"ContainerStarted","Data":"2485280c0290dff409b5573ed440e3d66e276a8ad6aae72a28d9678056baf473"} Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:31.859208 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-747cc54cb5-cgsnc" event={"ID":"40fdaf60-ff95-49e7-9c00-622d211a969b","Type":"ContainerStarted","Data":"4be2c2ba78430c1491982524a6a94677a585c292356b299143f9c1ec9bcd2b37"} Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.137196 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0" path="/var/lib/kubelet/pods/65e8d6e4-c7b8-4aca-8df5-ec3f9a655cb0/volumes" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.577627 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.604991 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594cfd6f5c-pppkp"] Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.648126 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76b8bd6cbf-rvflb"] Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.649682 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.675974 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.688270 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76b8bd6cbf-rvflb"] Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.791753 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ebf1a9-601e-4127-afcd-641216ae3a11-logs\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.791842 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-config-data\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.791871 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-scripts\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.791934 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25gjk\" (UniqueName: \"kubernetes.io/projected/75ebf1a9-601e-4127-afcd-641216ae3a11-kube-api-access-25gjk\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc 
kubenswrapper[4781]: I1208 20:23:32.791993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/75ebf1a9-601e-4127-afcd-641216ae3a11-horizon-secret-key\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.869848 4781 generic.go:334] "Generic (PLEG): container finished" podID="13fc3ed9-1b10-4793-bc58-ab555d22827c" containerID="aa520af3124c396effa212b7e6915fdff0d8cd9e45c2181aafb8b37f96ab296d" exitCode=0 Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.869951 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5689975857-zcq4k" event={"ID":"13fc3ed9-1b10-4793-bc58-ab555d22827c","Type":"ContainerDied","Data":"aa520af3124c396effa212b7e6915fdff0d8cd9e45c2181aafb8b37f96ab296d"} Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.873590 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lnhsl" event={"ID":"9d1074ff-6031-42ce-85b7-acfed9dad04d","Type":"ContainerStarted","Data":"4002d2d62cc373bdd4cdd8a9a775ab3646cad2b3bf679696b00bf99a3fe6f6ec"} Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.894159 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/75ebf1a9-601e-4127-afcd-641216ae3a11-horizon-secret-key\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.894977 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ebf1a9-601e-4127-afcd-641216ae3a11-logs\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " 
pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.895323 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-config-data\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.895401 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-scripts\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.895607 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25gjk\" (UniqueName: \"kubernetes.io/projected/75ebf1a9-601e-4127-afcd-641216ae3a11-kube-api-access-25gjk\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.897021 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ebf1a9-601e-4127-afcd-641216ae3a11-logs\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.898745 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-scripts\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.899008 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-config-data\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.924826 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25gjk\" (UniqueName: \"kubernetes.io/projected/75ebf1a9-601e-4127-afcd-641216ae3a11-kube-api-access-25gjk\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.931363 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lnhsl" podStartSLOduration=2.931343044 podStartE2EDuration="2.931343044s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:32.914694695 +0000 UTC m=+1129.065978072" watchObservedRunningTime="2025-12-08 20:23:32.931343044 +0000 UTC m=+1129.082626411" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.954440 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/75ebf1a9-601e-4127-afcd-641216ae3a11-horizon-secret-key\") pod \"horizon-76b8bd6cbf-rvflb\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:32 crc kubenswrapper[4781]: I1208 20:23:32.983996 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.059418 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.110039 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-75vwt"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.230891 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5ls9p"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.666641 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.697799 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:23:33 crc kubenswrapper[4781]: W1208 20:23:33.701105 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda20c49c_5d8d_4701_a54b_c6ae7b6670db.slice/crio-e67454e5f5b46af58d24c6f9f08ad90daba19b463d3f3259d2aefbcddd4d2ad7 WatchSource:0}: Error finding container e67454e5f5b46af58d24c6f9f08ad90daba19b463d3f3259d2aefbcddd4d2ad7: Status 404 returned error can't find the container with id e67454e5f5b46af58d24c6f9f08ad90daba19b463d3f3259d2aefbcddd4d2ad7 Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.711245 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594cfd6f5c-pppkp"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.754177 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-bnfg8"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.763725 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-lk5tr"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.775266 4781 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/barbican-db-sync-9wz9p"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.785764 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-config\") pod \"13fc3ed9-1b10-4793-bc58-ab555d22827c\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.785831 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-svc\") pod \"13fc3ed9-1b10-4793-bc58-ab555d22827c\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.785952 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpnw4\" (UniqueName: \"kubernetes.io/projected/13fc3ed9-1b10-4793-bc58-ab555d22827c-kube-api-access-tpnw4\") pod \"13fc3ed9-1b10-4793-bc58-ab555d22827c\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.786077 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-nb\") pod \"13fc3ed9-1b10-4793-bc58-ab555d22827c\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.786149 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-sb\") pod \"13fc3ed9-1b10-4793-bc58-ab555d22827c\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.786210 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-swift-storage-0\") pod \"13fc3ed9-1b10-4793-bc58-ab555d22827c\" (UID: \"13fc3ed9-1b10-4793-bc58-ab555d22827c\") " Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.799108 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13fc3ed9-1b10-4793-bc58-ab555d22827c-kube-api-access-tpnw4" (OuterVolumeSpecName: "kube-api-access-tpnw4") pod "13fc3ed9-1b10-4793-bc58-ab555d22827c" (UID: "13fc3ed9-1b10-4793-bc58-ab555d22827c"). InnerVolumeSpecName "kube-api-access-tpnw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.820436 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "13fc3ed9-1b10-4793-bc58-ab555d22827c" (UID: "13fc3ed9-1b10-4793-bc58-ab555d22827c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.844522 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "13fc3ed9-1b10-4793-bc58-ab555d22827c" (UID: "13fc3ed9-1b10-4793-bc58-ab555d22827c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.849736 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76b8bd6cbf-rvflb"] Dec 08 20:23:33 crc kubenswrapper[4781]: W1208 20:23:33.869463 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75ebf1a9_601e_4127_afcd_641216ae3a11.slice/crio-44433e51799023a2926b6443fc1c14a5ee28ecaf2c203d113a46ca8f4cc35b09 WatchSource:0}: Error finding container 44433e51799023a2926b6443fc1c14a5ee28ecaf2c203d113a46ca8f4cc35b09: Status 404 returned error can't find the container with id 44433e51799023a2926b6443fc1c14a5ee28ecaf2c203d113a46ca8f4cc35b09 Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.878808 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-config" (OuterVolumeSpecName: "config") pod "13fc3ed9-1b10-4793-bc58-ab555d22827c" (UID: "13fc3ed9-1b10-4793-bc58-ab555d22827c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.879734 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "13fc3ed9-1b10-4793-bc58-ab555d22827c" (UID: "13fc3ed9-1b10-4793-bc58-ab555d22827c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.881229 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "13fc3ed9-1b10-4793-bc58-ab555d22827c" (UID: "13fc3ed9-1b10-4793-bc58-ab555d22827c"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.888480 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.888520 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.888532 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.888545 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.888595 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/13fc3ed9-1b10-4793-bc58-ab555d22827c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.888609 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpnw4\" (UniqueName: \"kubernetes.io/projected/13fc3ed9-1b10-4793-bc58-ab555d22827c-kube-api-access-tpnw4\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.897528 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.898480 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5689975857-zcq4k" event={"ID":"13fc3ed9-1b10-4793-bc58-ab555d22827c","Type":"ContainerDied","Data":"9759b3d9f0215a54cf4cb636ac1a221b052e2a1e13358e9df297f08941284731"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.898538 4781 scope.go:117] "RemoveContainer" containerID="aa520af3124c396effa212b7e6915fdff0d8cd9e45c2181aafb8b37f96ab296d" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.898635 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5689975857-zcq4k" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.900945 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5ls9p" event={"ID":"ac6c7297-a2fe-4569-b0a2-a0a2329df115","Type":"ContainerStarted","Data":"d859300ce3f58859e8c5cf261bb796ad5a7b29f7f6825fab7f30e4a129604b9e"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.900989 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5ls9p" event={"ID":"ac6c7297-a2fe-4569-b0a2-a0a2329df115","Type":"ContainerStarted","Data":"4c8b74eb159c2b8407f26753d5466c00fce9d87d054dbdc41ff01f4544b3d9e0"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.903062 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" event={"ID":"8bf90300-5bf4-4ec5-ba4b-6145663748fa","Type":"ContainerStarted","Data":"637f931e83d20cff4432c8c36ea7f552ea6f30f23da92d69db600aaeda68e4f4"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.904507 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594cfd6f5c-pppkp" event={"ID":"965ff155-f5c6-4c49-9d46-18a62ef94308","Type":"ContainerStarted","Data":"9222305ff5447d60724e8dc626561bdb321cbe8de9906873a5a8590e54b90779"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.907957 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-75vwt" 
event={"ID":"499d0466-ecbe-4866-b516-3c778c16ec94","Type":"ContainerStarted","Data":"f16d2c47262058cb18175d4dab6fbb4dd2f2cb942ba60bec0233849a27bbed4a"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.914171 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da20c49c-5d8d-4701-a54b-c6ae7b6670db","Type":"ContainerStarted","Data":"e67454e5f5b46af58d24c6f9f08ad90daba19b463d3f3259d2aefbcddd4d2ad7"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.920365 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76b8bd6cbf-rvflb" event={"ID":"75ebf1a9-601e-4127-afcd-641216ae3a11","Type":"ContainerStarted","Data":"44433e51799023a2926b6443fc1c14a5ee28ecaf2c203d113a46ca8f4cc35b09"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.922341 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-5ls9p" podStartSLOduration=3.922318914 podStartE2EDuration="3.922318914s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:33.920218704 +0000 UTC m=+1130.071502081" watchObservedRunningTime="2025-12-08 20:23:33.922318914 +0000 UTC m=+1130.073602291" Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.929744 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnfg8" event={"ID":"c4245549-33f8-4a0e-a17e-08417bed869c","Type":"ContainerStarted","Data":"37fa6cbb207827f3301373eaa82b6522efe52d4b8600467412da9bd8686099f6"} Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.933261 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9wz9p" event={"ID":"279c45d3-ece4-42b6-8968-90806c171bf9","Type":"ContainerStarted","Data":"9fa7d917a5f507ce4d588bb23b23ce0e2b3b22ff604c02a56f29536150a355db"} Dec 08 20:23:33 crc kubenswrapper[4781]: W1208 
20:23:33.944227 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cf3ad3d_1ea0_4fa3_ac7e_ee56ee398f70.slice/crio-777070978b62f8972f29e572c19efd010f716e1e640df2b0e90e06e21f5b9577 WatchSource:0}: Error finding container 777070978b62f8972f29e572c19efd010f716e1e640df2b0e90e06e21f5b9577: Status 404 returned error can't find the container with id 777070978b62f8972f29e572c19efd010f716e1e640df2b0e90e06e21f5b9577 Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.978883 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5689975857-zcq4k"] Dec 08 20:23:33 crc kubenswrapper[4781]: I1208 20:23:33.986213 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5689975857-zcq4k"] Dec 08 20:23:34 crc kubenswrapper[4781]: I1208 20:23:34.164731 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13fc3ed9-1b10-4793-bc58-ab555d22827c" path="/var/lib/kubelet/pods/13fc3ed9-1b10-4793-bc58-ab555d22827c/volumes" Dec 08 20:23:34 crc kubenswrapper[4781]: I1208 20:23:34.465931 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:23:34 crc kubenswrapper[4781]: W1208 20:23:34.535575 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode29d9331_4757_45f1_b92d_affdf17f3a09.slice/crio-baeb86e64f9641c759989125361156e75efcb95328dd8668d84c360b59c4511f WatchSource:0}: Error finding container baeb86e64f9641c759989125361156e75efcb95328dd8668d84c360b59c4511f: Status 404 returned error can't find the container with id baeb86e64f9641c759989125361156e75efcb95328dd8668d84c360b59c4511f Dec 08 20:23:34 crc kubenswrapper[4781]: I1208 20:23:34.954118 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70","Type":"ContainerStarted","Data":"777070978b62f8972f29e572c19efd010f716e1e640df2b0e90e06e21f5b9577"} Dec 08 20:23:34 crc kubenswrapper[4781]: I1208 20:23:34.955477 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29d9331-4757-45f1-b92d-affdf17f3a09","Type":"ContainerStarted","Data":"baeb86e64f9641c759989125361156e75efcb95328dd8668d84c360b59c4511f"} Dec 08 20:23:34 crc kubenswrapper[4781]: I1208 20:23:34.957339 4781 generic.go:334] "Generic (PLEG): container finished" podID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerID="61e1d59585782f2e2c7e98ef23a5b5ae06610dfdd07a3441aa925806f81db1c0" exitCode=0 Dec 08 20:23:34 crc kubenswrapper[4781]: I1208 20:23:34.957508 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" event={"ID":"8bf90300-5bf4-4ec5-ba4b-6145663748fa","Type":"ContainerDied","Data":"61e1d59585782f2e2c7e98ef23a5b5ae06610dfdd07a3441aa925806f81db1c0"} Dec 08 20:23:35 crc kubenswrapper[4781]: I1208 20:23:35.984482 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" event={"ID":"8bf90300-5bf4-4ec5-ba4b-6145663748fa","Type":"ContainerStarted","Data":"bf42e12ed62d1871d0c5d435c0b91e48e8ea830556985295a2dec250f5258ab3"} Dec 08 20:23:35 crc kubenswrapper[4781]: I1208 20:23:35.986149 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" Dec 08 20:23:35 crc kubenswrapper[4781]: I1208 20:23:35.996675 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70","Type":"ContainerStarted","Data":"e17c07cbd65814e184dcb385361322a3da697853b6f5a2d6efdd5af7aca86ca0"} Dec 08 20:23:36 crc kubenswrapper[4781]: I1208 20:23:36.008788 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" podStartSLOduration=6.008766206 podStartE2EDuration="6.008766206s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:36.007242702 +0000 UTC m=+1132.158526079" watchObservedRunningTime="2025-12-08 20:23:36.008766206 +0000 UTC m=+1132.160049583" Dec 08 20:23:37 crc kubenswrapper[4781]: I1208 20:23:37.010577 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70","Type":"ContainerStarted","Data":"6528f5f6e505b33293a5f68a521362a16d1e4a97cb27ce5ef64333977e533f83"} Dec 08 20:23:37 crc kubenswrapper[4781]: I1208 20:23:37.010715 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerName="glance-httpd" containerID="cri-o://6528f5f6e505b33293a5f68a521362a16d1e4a97cb27ce5ef64333977e533f83" gracePeriod=30 Dec 08 20:23:37 crc kubenswrapper[4781]: I1208 20:23:37.011035 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerName="glance-log" containerID="cri-o://e17c07cbd65814e184dcb385361322a3da697853b6f5a2d6efdd5af7aca86ca0" gracePeriod=30 Dec 08 20:23:37 crc kubenswrapper[4781]: I1208 20:23:37.014157 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29d9331-4757-45f1-b92d-affdf17f3a09","Type":"ContainerStarted","Data":"f058f0d5b3a17571a9cad81a1f1b9e051b85d6250f26b884666f25483a49a5c5"} Dec 08 20:23:37 crc kubenswrapper[4781]: I1208 20:23:37.039835 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.039813547 
podStartE2EDuration="7.039813547s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:37.03714035 +0000 UTC m=+1133.188423747" watchObservedRunningTime="2025-12-08 20:23:37.039813547 +0000 UTC m=+1133.191096924" Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.034734 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29d9331-4757-45f1-b92d-affdf17f3a09","Type":"ContainerStarted","Data":"56a45b606f103f794d5e228ec39cacb874cd460f19fe69d24523a210e4f8cbdf"} Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.034838 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerName="glance-log" containerID="cri-o://f058f0d5b3a17571a9cad81a1f1b9e051b85d6250f26b884666f25483a49a5c5" gracePeriod=30 Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.035131 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerName="glance-httpd" containerID="cri-o://56a45b606f103f794d5e228ec39cacb874cd460f19fe69d24523a210e4f8cbdf" gracePeriod=30 Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.037035 4781 generic.go:334] "Generic (PLEG): container finished" podID="9d1074ff-6031-42ce-85b7-acfed9dad04d" containerID="4002d2d62cc373bdd4cdd8a9a775ab3646cad2b3bf679696b00bf99a3fe6f6ec" exitCode=0 Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.037174 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lnhsl" event={"ID":"9d1074ff-6031-42ce-85b7-acfed9dad04d","Type":"ContainerDied","Data":"4002d2d62cc373bdd4cdd8a9a775ab3646cad2b3bf679696b00bf99a3fe6f6ec"} Dec 08 20:23:38 crc kubenswrapper[4781]: 
I1208 20:23:38.045164 4781 generic.go:334] "Generic (PLEG): container finished" podID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerID="6528f5f6e505b33293a5f68a521362a16d1e4a97cb27ce5ef64333977e533f83" exitCode=0 Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.045197 4781 generic.go:334] "Generic (PLEG): container finished" podID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerID="e17c07cbd65814e184dcb385361322a3da697853b6f5a2d6efdd5af7aca86ca0" exitCode=143 Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.045257 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70","Type":"ContainerDied","Data":"6528f5f6e505b33293a5f68a521362a16d1e4a97cb27ce5ef64333977e533f83"} Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.045351 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70","Type":"ContainerDied","Data":"e17c07cbd65814e184dcb385361322a3da697853b6f5a2d6efdd5af7aca86ca0"} Dec 08 20:23:38 crc kubenswrapper[4781]: I1208 20:23:38.064723 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.064699672 podStartE2EDuration="8.064699672s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:38.056498316 +0000 UTC m=+1134.207781713" watchObservedRunningTime="2025-12-08 20:23:38.064699672 +0000 UTC m=+1134.215983049" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.072033 4781 generic.go:334] "Generic (PLEG): container finished" podID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerID="56a45b606f103f794d5e228ec39cacb874cd460f19fe69d24523a210e4f8cbdf" exitCode=0 Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.072080 4781 
generic.go:334] "Generic (PLEG): container finished" podID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerID="f058f0d5b3a17571a9cad81a1f1b9e051b85d6250f26b884666f25483a49a5c5" exitCode=143 Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.072118 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29d9331-4757-45f1-b92d-affdf17f3a09","Type":"ContainerDied","Data":"56a45b606f103f794d5e228ec39cacb874cd460f19fe69d24523a210e4f8cbdf"} Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.072157 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29d9331-4757-45f1-b92d-affdf17f3a09","Type":"ContainerDied","Data":"f058f0d5b3a17571a9cad81a1f1b9e051b85d6250f26b884666f25483a49a5c5"} Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.555952 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-747cc54cb5-cgsnc"] Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.579223 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66bd65f4cd-rdlc7"] Dec 08 20:23:39 crc kubenswrapper[4781]: E1208 20:23:39.584711 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13fc3ed9-1b10-4793-bc58-ab555d22827c" containerName="init" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.584855 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="13fc3ed9-1b10-4793-bc58-ab555d22827c" containerName="init" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.585083 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="13fc3ed9-1b10-4793-bc58-ab555d22827c" containerName="init" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.586027 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.588834 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.604891 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66bd65f4cd-rdlc7"] Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.692378 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76b8bd6cbf-rvflb"] Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.719423 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75dd9f77c4-85lhw"] Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.721119 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.734192 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75dd9f77c4-85lhw"] Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.749141 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-tls-certs\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.749200 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-config-data\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.749230 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-n8mpj\" (UniqueName: \"kubernetes.io/projected/2b714ffd-7c31-434d-a833-04abe6c8dcfb-kube-api-access-n8mpj\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.749291 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-secret-key\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.749314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b714ffd-7c31-434d-a833-04abe6c8dcfb-logs\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.749999 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-scripts\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.750037 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-combined-ca-bundle\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851310 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-scripts\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851361 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-horizon-secret-key\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851414 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-combined-ca-bundle\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851441 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-horizon-tls-certs\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851480 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-tls-certs\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851514 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-combined-ca-bundle\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851532 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-config-data\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851552 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-logs\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8mpj\" (UniqueName: \"kubernetes.io/projected/2b714ffd-7c31-434d-a833-04abe6c8dcfb-kube-api-access-n8mpj\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-scripts\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851630 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p52pv\" (UniqueName: 
\"kubernetes.io/projected/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-kube-api-access-p52pv\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851679 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-config-data\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851839 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-secret-key\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.851908 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b714ffd-7c31-434d-a833-04abe6c8dcfb-logs\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.852223 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-scripts\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.852689 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b714ffd-7c31-434d-a833-04abe6c8dcfb-logs\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: 
\"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.854737 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-config-data\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.861869 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-tls-certs\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.862654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-combined-ca-bundle\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.864014 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-secret-key\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.870485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8mpj\" (UniqueName: \"kubernetes.io/projected/2b714ffd-7c31-434d-a833-04abe6c8dcfb-kube-api-access-n8mpj\") pod \"horizon-66bd65f4cd-rdlc7\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc 
kubenswrapper[4781]: I1208 20:23:39.924891 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.953211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-horizon-secret-key\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.953292 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-horizon-tls-certs\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.953366 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-combined-ca-bundle\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.953390 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-logs\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.953456 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-scripts\") pod \"horizon-75dd9f77c4-85lhw\" (UID: 
\"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.954018 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-logs\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.954088 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p52pv\" (UniqueName: \"kubernetes.io/projected/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-kube-api-access-p52pv\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.954334 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-scripts\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.954536 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-config-data\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.955630 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-config-data\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.957030 
4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-combined-ca-bundle\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.957608 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-horizon-secret-key\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.959034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-horizon-tls-certs\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:39 crc kubenswrapper[4781]: I1208 20:23:39.971873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p52pv\" (UniqueName: \"kubernetes.io/projected/eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf-kube-api-access-p52pv\") pod \"horizon-75dd9f77c4-85lhw\" (UID: \"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf\") " pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:40 crc kubenswrapper[4781]: I1208 20:23:40.039520 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:23:41 crc kubenswrapper[4781]: I1208 20:23:41.233184 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" Dec 08 20:23:41 crc kubenswrapper[4781]: I1208 20:23:41.298714 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d879466b9-tx84q"] Dec 08 20:23:41 crc kubenswrapper[4781]: I1208 20:23:41.298958 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d879466b9-tx84q" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="dnsmasq-dns" containerID="cri-o://5f7d25fcf036d3edffa6034a78fbd156419592ca8440b9ff8c70a8e1591f570e" gracePeriod=10 Dec 08 20:23:42 crc kubenswrapper[4781]: I1208 20:23:42.109235 4781 generic.go:334] "Generic (PLEG): container finished" podID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerID="5f7d25fcf036d3edffa6034a78fbd156419592ca8440b9ff8c70a8e1591f570e" exitCode=0 Dec 08 20:23:42 crc kubenswrapper[4781]: I1208 20:23:42.109287 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d879466b9-tx84q" event={"ID":"1b14a49e-1bf0-4914-9625-14c58c351d6a","Type":"ContainerDied","Data":"5f7d25fcf036d3edffa6034a78fbd156419592ca8440b9ff8c70a8e1591f570e"} Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.273222 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d879466b9-tx84q" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.658861 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.759484 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-public-tls-certs\") pod \"e29d9331-4757-45f1-b92d-affdf17f3a09\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.759565 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-httpd-run\") pod \"e29d9331-4757-45f1-b92d-affdf17f3a09\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.759589 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-config-data\") pod \"e29d9331-4757-45f1-b92d-affdf17f3a09\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.759700 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-combined-ca-bundle\") pod \"e29d9331-4757-45f1-b92d-affdf17f3a09\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.759757 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-logs\") pod \"e29d9331-4757-45f1-b92d-affdf17f3a09\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.759777 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"e29d9331-4757-45f1-b92d-affdf17f3a09\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.759810 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq97l\" (UniqueName: \"kubernetes.io/projected/e29d9331-4757-45f1-b92d-affdf17f3a09-kube-api-access-gq97l\") pod \"e29d9331-4757-45f1-b92d-affdf17f3a09\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.759852 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-scripts\") pod \"e29d9331-4757-45f1-b92d-affdf17f3a09\" (UID: \"e29d9331-4757-45f1-b92d-affdf17f3a09\") " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.760009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e29d9331-4757-45f1-b92d-affdf17f3a09" (UID: "e29d9331-4757-45f1-b92d-affdf17f3a09"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.760249 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-logs" (OuterVolumeSpecName: "logs") pod "e29d9331-4757-45f1-b92d-affdf17f3a09" (UID: "e29d9331-4757-45f1-b92d-affdf17f3a09"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.760615 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.760643 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e29d9331-4757-45f1-b92d-affdf17f3a09-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.771119 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-scripts" (OuterVolumeSpecName: "scripts") pod "e29d9331-4757-45f1-b92d-affdf17f3a09" (UID: "e29d9331-4757-45f1-b92d-affdf17f3a09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.771446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29d9331-4757-45f1-b92d-affdf17f3a09-kube-api-access-gq97l" (OuterVolumeSpecName: "kube-api-access-gq97l") pod "e29d9331-4757-45f1-b92d-affdf17f3a09" (UID: "e29d9331-4757-45f1-b92d-affdf17f3a09"). InnerVolumeSpecName "kube-api-access-gq97l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.786970 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "e29d9331-4757-45f1-b92d-affdf17f3a09" (UID: "e29d9331-4757-45f1-b92d-affdf17f3a09"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.824133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e29d9331-4757-45f1-b92d-affdf17f3a09" (UID: "e29d9331-4757-45f1-b92d-affdf17f3a09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.843889 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e29d9331-4757-45f1-b92d-affdf17f3a09" (UID: "e29d9331-4757-45f1-b92d-affdf17f3a09"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.855987 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-config-data" (OuterVolumeSpecName: "config-data") pod "e29d9331-4757-45f1-b92d-affdf17f3a09" (UID: "e29d9331-4757-45f1-b92d-affdf17f3a09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.863361 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.863396 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.863408 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.863448 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.863461 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq97l\" (UniqueName: \"kubernetes.io/projected/e29d9331-4757-45f1-b92d-affdf17f3a09-kube-api-access-gq97l\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.863474 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29d9331-4757-45f1-b92d-affdf17f3a09-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.888875 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 08 20:23:44 crc kubenswrapper[4781]: I1208 20:23:44.964789 4781 reconciler_common.go:293] "Volume detached for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.137422 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e29d9331-4757-45f1-b92d-affdf17f3a09","Type":"ContainerDied","Data":"baeb86e64f9641c759989125361156e75efcb95328dd8668d84c360b59c4511f"} Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.137472 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.137484 4781 scope.go:117] "RemoveContainer" containerID="56a45b606f103f794d5e228ec39cacb874cd460f19fe69d24523a210e4f8cbdf" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.175176 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.191546 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.200455 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:23:45 crc kubenswrapper[4781]: E1208 20:23:45.200902 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerName="glance-httpd" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.200940 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerName="glance-httpd" Dec 08 20:23:45 crc kubenswrapper[4781]: E1208 20:23:45.200979 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerName="glance-log" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.200985 4781 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerName="glance-log" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.201166 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerName="glance-httpd" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.201195 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" containerName="glance-log" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.202344 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.204931 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.205352 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.209763 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.376256 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.376382 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 
08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.376404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-logs\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.376513 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xss8\" (UniqueName: \"kubernetes.io/projected/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-kube-api-access-2xss8\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.376586 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.376634 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.376785 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" 
Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.376816 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.478214 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.478259 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-logs\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.478306 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xss8\" (UniqueName: \"kubernetes.io/projected/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-kube-api-access-2xss8\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.478346 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 
20:23:45.478380 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.478435 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.478461 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.478485 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.479392 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.479633 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-logs\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.479751 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.484210 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-config-data\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.486554 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.486897 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.487517 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-scripts\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.495583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xss8\" (UniqueName: \"kubernetes.io/projected/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-kube-api-access-2xss8\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.516424 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " pod="openstack/glance-default-external-api-0" Dec 08 20:23:45 crc kubenswrapper[4781]: I1208 20:23:45.525906 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:23:46 crc kubenswrapper[4781]: I1208 20:23:46.139162 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29d9331-4757-45f1-b92d-affdf17f3a09" path="/var/lib/kubelet/pods/e29d9331-4757-45f1-b92d-affdf17f3a09/volumes" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.206401 4781 generic.go:334] "Generic (PLEG): container finished" podID="ac6c7297-a2fe-4569-b0a2-a0a2329df115" containerID="d859300ce3f58859e8c5cf261bb796ad5a7b29f7f6825fab7f30e4a129604b9e" exitCode=0 Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.206509 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5ls9p" event={"ID":"ac6c7297-a2fe-4569-b0a2-a0a2329df115","Type":"ContainerDied","Data":"d859300ce3f58859e8c5cf261bb796ad5a7b29f7f6825fab7f30e4a129604b9e"} Dec 08 20:23:53 crc kubenswrapper[4781]: E1208 20:23:53.595811 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f" Dec 08 20:23:53 crc kubenswrapper[4781]: E1208 20:23:53.596263 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch5c8h5f7h647h65bh68bh5c8h85h686h589h555h54ch655h698h6bh574h665h5f8h7dhcchb7h59h5bbh87h57chdh5b7h68h5dh5b7h9h645q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25gjk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-76b8bd6cbf-rvflb_openstack(75ebf1a9-601e-4127-afcd-641216ae3a11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:23:53 crc kubenswrapper[4781]: E1208 
20:23:53.599065 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f\\\"\"]" pod="openstack/horizon-76b8bd6cbf-rvflb" podUID="75ebf1a9-601e-4127-afcd-641216ae3a11" Dec 08 20:23:53 crc kubenswrapper[4781]: E1208 20:23:53.609416 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f" Dec 08 20:23:53 crc kubenswrapper[4781]: E1208 20:23:53.609595 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n645hf7h54dh547hdbh5cdh5cch54bh65dh647h8ch8chdchb5h5bch56dh575h67bhc8h568h588h56ch5c7h645h557h9h66chcbh64h598h64h54q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qzrdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-747cc54cb5-cgsnc_openstack(40fdaf60-ff95-49e7-9c00-622d211a969b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:23:53 crc kubenswrapper[4781]: E1208 
20:23:53.612321 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:4f4b273dd4c6ead9bf640fb985d101a7c9adba388968fb1d71fbb08b0510eb9f\\\"\"]" pod="openstack/horizon-747cc54cb5-cgsnc" podUID="40fdaf60-ff95-49e7-9c00-622d211a969b" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.666962 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.673133 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836394 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-scripts\") pod \"9d1074ff-6031-42ce-85b7-acfed9dad04d\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836430 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-combined-ca-bundle\") pod \"9d1074ff-6031-42ce-85b7-acfed9dad04d\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836448 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-fernet-keys\") pod \"9d1074ff-6031-42ce-85b7-acfed9dad04d\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 
20:23:53.836464 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-combined-ca-bundle\") pod \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836487 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-config-data\") pod \"9d1074ff-6031-42ce-85b7-acfed9dad04d\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836507 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-internal-tls-certs\") pod \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836528 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-logs\") pod \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl4zf\" (UniqueName: \"kubernetes.io/projected/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-kube-api-access-jl4zf\") pod \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836582 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-httpd-run\") pod 
\"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836690 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-credential-keys\") pod \"9d1074ff-6031-42ce-85b7-acfed9dad04d\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836728 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836748 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtgdj\" (UniqueName: \"kubernetes.io/projected/9d1074ff-6031-42ce-85b7-acfed9dad04d-kube-api-access-xtgdj\") pod \"9d1074ff-6031-42ce-85b7-acfed9dad04d\" (UID: \"9d1074ff-6031-42ce-85b7-acfed9dad04d\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836835 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-config-data\") pod \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.836857 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-scripts\") pod \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\" (UID: \"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70\") " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.838299 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-logs" (OuterVolumeSpecName: "logs") pod "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" (UID: "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.838698 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" (UID: "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.843178 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-scripts" (OuterVolumeSpecName: "scripts") pod "9d1074ff-6031-42ce-85b7-acfed9dad04d" (UID: "9d1074ff-6031-42ce-85b7-acfed9dad04d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.843319 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-kube-api-access-jl4zf" (OuterVolumeSpecName: "kube-api-access-jl4zf") pod "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" (UID: "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70"). InnerVolumeSpecName "kube-api-access-jl4zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.861165 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9d1074ff-6031-42ce-85b7-acfed9dad04d" (UID: "9d1074ff-6031-42ce-85b7-acfed9dad04d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.861908 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" (UID: "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.863758 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-scripts" (OuterVolumeSpecName: "scripts") pod "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" (UID: "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.864633 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9d1074ff-6031-42ce-85b7-acfed9dad04d" (UID: "9d1074ff-6031-42ce-85b7-acfed9dad04d"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.867646 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9d1074ff-6031-42ce-85b7-acfed9dad04d" (UID: "9d1074ff-6031-42ce-85b7-acfed9dad04d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.867664 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" (UID: "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.869498 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1074ff-6031-42ce-85b7-acfed9dad04d-kube-api-access-xtgdj" (OuterVolumeSpecName: "kube-api-access-xtgdj") pod "9d1074ff-6031-42ce-85b7-acfed9dad04d" (UID: "9d1074ff-6031-42ce-85b7-acfed9dad04d"). InnerVolumeSpecName "kube-api-access-xtgdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.870856 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-config-data" (OuterVolumeSpecName: "config-data") pod "9d1074ff-6031-42ce-85b7-acfed9dad04d" (UID: "9d1074ff-6031-42ce-85b7-acfed9dad04d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.897563 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-config-data" (OuterVolumeSpecName: "config-data") pod "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" (UID: "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.911320 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" (UID: "9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939054 4781 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939110 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939126 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtgdj\" (UniqueName: \"kubernetes.io/projected/9d1074ff-6031-42ce-85b7-acfed9dad04d-kube-api-access-xtgdj\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939140 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939151 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939161 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939171 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939183 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939194 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939203 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d1074ff-6031-42ce-85b7-acfed9dad04d-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939213 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939223 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.939233 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl4zf\" (UniqueName: \"kubernetes.io/projected/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-kube-api-access-jl4zf\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc 
kubenswrapper[4781]: I1208 20:23:53.939242 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:53 crc kubenswrapper[4781]: I1208 20:23:53.959381 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.040522 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.216252 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lnhsl" event={"ID":"9d1074ff-6031-42ce-85b7-acfed9dad04d","Type":"ContainerDied","Data":"2485280c0290dff409b5573ed440e3d66e276a8ad6aae72a28d9678056baf473"} Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.216542 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2485280c0290dff409b5573ed440e3d66e276a8ad6aae72a28d9678056baf473" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.216300 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lnhsl" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.218821 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70","Type":"ContainerDied","Data":"777070978b62f8972f29e572c19efd010f716e1e640df2b0e90e06e21f5b9577"} Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.218943 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.274000 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d879466b9-tx84q" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.296683 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.316095 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.325709 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:23:54 crc kubenswrapper[4781]: E1208 20:23:54.341543 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerName="glance-log" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.341564 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerName="glance-log" Dec 08 20:23:54 crc kubenswrapper[4781]: E1208 20:23:54.341594 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1074ff-6031-42ce-85b7-acfed9dad04d" containerName="keystone-bootstrap" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.341602 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1074ff-6031-42ce-85b7-acfed9dad04d" containerName="keystone-bootstrap" Dec 08 20:23:54 crc kubenswrapper[4781]: E1208 20:23:54.341612 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerName="glance-httpd" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.341618 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerName="glance-httpd" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.341776 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerName="glance-httpd" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.341796 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" containerName="glance-log" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.341812 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1074ff-6031-42ce-85b7-acfed9dad04d" containerName="keystone-bootstrap" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.342631 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.342714 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.355391 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.355804 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.458843 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhdhz\" (UniqueName: \"kubernetes.io/projected/34bc4529-0766-4d9b-ad5c-5b78604ebb10-kube-api-access-lhdhz\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.458953 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-logs\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.458989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.459010 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.459040 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.459076 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.459107 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.459408 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.561656 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.561723 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.561784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.561858 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhdhz\" (UniqueName: 
\"kubernetes.io/projected/34bc4529-0766-4d9b-ad5c-5b78604ebb10-kube-api-access-lhdhz\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.561909 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-logs\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.561992 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.562014 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.562041 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.562681 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.570776 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-logs\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.576012 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.577218 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.577326 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.578553 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " 
pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.579389 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.589247 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhdhz\" (UniqueName: \"kubernetes.io/projected/34bc4529-0766-4d9b-ad5c-5b78604ebb10-kube-api-access-lhdhz\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.618407 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.682650 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.791331 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lnhsl"] Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.799839 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lnhsl"] Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.884982 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-p8pdd"] Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.886295 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.888248 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.888466 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.894273 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.894484 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4bstc" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.895150 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 08 20:23:54 crc kubenswrapper[4781]: I1208 20:23:54.897609 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p8pdd"] Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.076339 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wth5g\" (UniqueName: \"kubernetes.io/projected/ec5cb8d0-f612-43f8-a3c2-27953b92735b-kube-api-access-wth5g\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.076387 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-config-data\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.076644 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-fernet-keys\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.076763 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-credential-keys\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.076839 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-scripts\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.076905 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-combined-ca-bundle\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.178687 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-scripts\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.178763 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-combined-ca-bundle\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.178814 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wth5g\" (UniqueName: \"kubernetes.io/projected/ec5cb8d0-f612-43f8-a3c2-27953b92735b-kube-api-access-wth5g\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.178830 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-config-data\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.178875 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-fernet-keys\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.178911 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-credential-keys\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.183805 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-fernet-keys\") pod \"keystone-bootstrap-p8pdd\" 
(UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.185111 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-scripts\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.186364 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-credential-keys\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.186795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-combined-ca-bundle\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.187757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-config-data\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.196895 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wth5g\" (UniqueName: \"kubernetes.io/projected/ec5cb8d0-f612-43f8-a3c2-27953b92735b-kube-api-access-wth5g\") pod \"keystone-bootstrap-p8pdd\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 
20:23:55.204024 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.674868 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.788292 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-nb\") pod \"1b14a49e-1bf0-4914-9625-14c58c351d6a\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.788607 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbtjj\" (UniqueName: \"kubernetes.io/projected/1b14a49e-1bf0-4914-9625-14c58c351d6a-kube-api-access-xbtjj\") pod \"1b14a49e-1bf0-4914-9625-14c58c351d6a\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.788713 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-config\") pod \"1b14a49e-1bf0-4914-9625-14c58c351d6a\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.788765 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-sb\") pod \"1b14a49e-1bf0-4914-9625-14c58c351d6a\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.788827 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-swift-storage-0\") pod \"1b14a49e-1bf0-4914-9625-14c58c351d6a\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.788866 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-svc\") pod \"1b14a49e-1bf0-4914-9625-14c58c351d6a\" (UID: \"1b14a49e-1bf0-4914-9625-14c58c351d6a\") " Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.795341 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b14a49e-1bf0-4914-9625-14c58c351d6a-kube-api-access-xbtjj" (OuterVolumeSpecName: "kube-api-access-xbtjj") pod "1b14a49e-1bf0-4914-9625-14c58c351d6a" (UID: "1b14a49e-1bf0-4914-9625-14c58c351d6a"). InnerVolumeSpecName "kube-api-access-xbtjj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.834020 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-config" (OuterVolumeSpecName: "config") pod "1b14a49e-1bf0-4914-9625-14c58c351d6a" (UID: "1b14a49e-1bf0-4914-9625-14c58c351d6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.839540 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b14a49e-1bf0-4914-9625-14c58c351d6a" (UID: "1b14a49e-1bf0-4914-9625-14c58c351d6a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.839707 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b14a49e-1bf0-4914-9625-14c58c351d6a" (UID: "1b14a49e-1bf0-4914-9625-14c58c351d6a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.842296 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b14a49e-1bf0-4914-9625-14c58c351d6a" (UID: "1b14a49e-1bf0-4914-9625-14c58c351d6a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.861576 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b14a49e-1bf0-4914-9625-14c58c351d6a" (UID: "1b14a49e-1bf0-4914-9625-14c58c351d6a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.891581 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.891618 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.891633 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.891647 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.891658 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b14a49e-1bf0-4914-9625-14c58c351d6a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.891668 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbtjj\" (UniqueName: \"kubernetes.io/projected/1b14a49e-1bf0-4914-9625-14c58c351d6a-kube-api-access-xbtjj\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:55 crc kubenswrapper[4781]: I1208 20:23:55.993485 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66bd65f4cd-rdlc7"] Dec 08 20:23:56 crc kubenswrapper[4781]: I1208 20:23:56.136572 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70" 
path="/var/lib/kubelet/pods/9cf3ad3d-1ea0-4fa3-ac7e-ee56ee398f70/volumes" Dec 08 20:23:56 crc kubenswrapper[4781]: I1208 20:23:56.137390 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1074ff-6031-42ce-85b7-acfed9dad04d" path="/var/lib/kubelet/pods/9d1074ff-6031-42ce-85b7-acfed9dad04d/volumes" Dec 08 20:23:56 crc kubenswrapper[4781]: I1208 20:23:56.235789 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d879466b9-tx84q" event={"ID":"1b14a49e-1bf0-4914-9625-14c58c351d6a","Type":"ContainerDied","Data":"2f2c4c073898332e80f6f0d1db091b2d6746e3a72dd86bd06af0f6798f26109f"} Dec 08 20:23:56 crc kubenswrapper[4781]: I1208 20:23:56.235870 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d879466b9-tx84q" Dec 08 20:23:56 crc kubenswrapper[4781]: I1208 20:23:56.264261 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d879466b9-tx84q"] Dec 08 20:23:56 crc kubenswrapper[4781]: I1208 20:23:56.273314 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d879466b9-tx84q"] Dec 08 20:23:56 crc kubenswrapper[4781]: E1208 20:23:56.731510 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2" Dec 08 20:23:56 crc kubenswrapper[4781]: E1208 20:23:56.731676 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5q8km,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-75vwt_openstack(499d0466-ecbe-4866-b516-3c778c16ec94): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:23:56 crc kubenswrapper[4781]: E1208 20:23:56.732851 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-75vwt" podUID="499d0466-ecbe-4866-b516-3c778c16ec94" Dec 08 20:23:57 crc kubenswrapper[4781]: E1208 20:23:57.195635 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f" Dec 08 20:23:57 crc kubenswrapper[4781]: E1208 20:23:57.196393 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjgtj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-9wz9p_openstack(279c45d3-ece4-42b6-8968-90806c171bf9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:23:57 crc kubenswrapper[4781]: E1208 20:23:57.197821 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-9wz9p" 
podUID="279c45d3-ece4-42b6-8968-90806c171bf9" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.250822 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5ls9p" event={"ID":"ac6c7297-a2fe-4569-b0a2-a0a2329df115","Type":"ContainerDied","Data":"4c8b74eb159c2b8407f26753d5466c00fce9d87d054dbdc41ff01f4544b3d9e0"} Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.250899 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c8b74eb159c2b8407f26753d5466c00fce9d87d054dbdc41ff01f4544b3d9e0" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.252851 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76b8bd6cbf-rvflb" event={"ID":"75ebf1a9-601e-4127-afcd-641216ae3a11","Type":"ContainerDied","Data":"44433e51799023a2926b6443fc1c14a5ee28ecaf2c203d113a46ca8f4cc35b09"} Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.252881 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44433e51799023a2926b6443fc1c14a5ee28ecaf2c203d113a46ca8f4cc35b09" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.254483 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-747cc54cb5-cgsnc" event={"ID":"40fdaf60-ff95-49e7-9c00-622d211a969b","Type":"ContainerDied","Data":"4be2c2ba78430c1491982524a6a94677a585c292356b299143f9c1ec9bcd2b37"} Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.254510 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4be2c2ba78430c1491982524a6a94677a585c292356b299143f9c1ec9bcd2b37" Dec 08 20:23:57 crc kubenswrapper[4781]: E1208 20:23:57.256610 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2\\\"\"" 
pod="openstack/cinder-db-sync-75vwt" podUID="499d0466-ecbe-4866-b516-3c778c16ec94" Dec 08 20:23:57 crc kubenswrapper[4781]: E1208 20:23:57.257304 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f\\\"\"" pod="openstack/barbican-db-sync-9wz9p" podUID="279c45d3-ece4-42b6-8968-90806c171bf9" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.299206 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.337546 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.352032 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.416653 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-config\") pod \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.416695 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-combined-ca-bundle\") pod \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.416811 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl9tk\" (UniqueName: \"kubernetes.io/projected/ac6c7297-a2fe-4569-b0a2-a0a2329df115-kube-api-access-fl9tk\") pod \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\" (UID: \"ac6c7297-a2fe-4569-b0a2-a0a2329df115\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.422598 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6c7297-a2fe-4569-b0a2-a0a2329df115-kube-api-access-fl9tk" (OuterVolumeSpecName: "kube-api-access-fl9tk") pod "ac6c7297-a2fe-4569-b0a2-a0a2329df115" (UID: "ac6c7297-a2fe-4569-b0a2-a0a2329df115"). InnerVolumeSpecName "kube-api-access-fl9tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.441021 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac6c7297-a2fe-4569-b0a2-a0a2329df115" (UID: "ac6c7297-a2fe-4569-b0a2-a0a2329df115"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.443442 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-config" (OuterVolumeSpecName: "config") pod "ac6c7297-a2fe-4569-b0a2-a0a2329df115" (UID: "ac6c7297-a2fe-4569-b0a2-a0a2329df115"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.519755 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-config-data\") pod \"75ebf1a9-601e-4127-afcd-641216ae3a11\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.519798 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzrdj\" (UniqueName: \"kubernetes.io/projected/40fdaf60-ff95-49e7-9c00-622d211a969b-kube-api-access-qzrdj\") pod \"40fdaf60-ff95-49e7-9c00-622d211a969b\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.519848 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25gjk\" (UniqueName: \"kubernetes.io/projected/75ebf1a9-601e-4127-afcd-641216ae3a11-kube-api-access-25gjk\") pod \"75ebf1a9-601e-4127-afcd-641216ae3a11\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.519901 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-scripts\") pod \"75ebf1a9-601e-4127-afcd-641216ae3a11\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.519962 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-scripts\") pod \"40fdaf60-ff95-49e7-9c00-622d211a969b\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.520103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40fdaf60-ff95-49e7-9c00-622d211a969b-logs\") pod \"40fdaf60-ff95-49e7-9c00-622d211a969b\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.520142 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/75ebf1a9-601e-4127-afcd-641216ae3a11-horizon-secret-key\") pod \"75ebf1a9-601e-4127-afcd-641216ae3a11\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.520186 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-config-data\") pod \"40fdaf60-ff95-49e7-9c00-622d211a969b\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.520240 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40fdaf60-ff95-49e7-9c00-622d211a969b-horizon-secret-key\") pod \"40fdaf60-ff95-49e7-9c00-622d211a969b\" (UID: \"40fdaf60-ff95-49e7-9c00-622d211a969b\") " Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.520282 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ebf1a9-601e-4127-afcd-641216ae3a11-logs\") pod \"75ebf1a9-601e-4127-afcd-641216ae3a11\" (UID: \"75ebf1a9-601e-4127-afcd-641216ae3a11\") " Dec 08 20:23:57 crc 
kubenswrapper[4781]: I1208 20:23:57.520721 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40fdaf60-ff95-49e7-9c00-622d211a969b-logs" (OuterVolumeSpecName: "logs") pod "40fdaf60-ff95-49e7-9c00-622d211a969b" (UID: "40fdaf60-ff95-49e7-9c00-622d211a969b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.520787 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-scripts" (OuterVolumeSpecName: "scripts") pod "40fdaf60-ff95-49e7-9c00-622d211a969b" (UID: "40fdaf60-ff95-49e7-9c00-622d211a969b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.521002 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-scripts" (OuterVolumeSpecName: "scripts") pod "75ebf1a9-601e-4127-afcd-641216ae3a11" (UID: "75ebf1a9-601e-4127-afcd-641216ae3a11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.521018 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75ebf1a9-601e-4127-afcd-641216ae3a11-logs" (OuterVolumeSpecName: "logs") pod "75ebf1a9-601e-4127-afcd-641216ae3a11" (UID: "75ebf1a9-601e-4127-afcd-641216ae3a11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.521356 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-config-data" (OuterVolumeSpecName: "config-data") pod "40fdaf60-ff95-49e7-9c00-622d211a969b" (UID: "40fdaf60-ff95-49e7-9c00-622d211a969b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.521397 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-config-data" (OuterVolumeSpecName: "config-data") pod "75ebf1a9-601e-4127-afcd-641216ae3a11" (UID: "75ebf1a9-601e-4127-afcd-641216ae3a11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.521851 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522231 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522298 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6c7297-a2fe-4569-b0a2-a0a2329df115-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522362 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40fdaf60-ff95-49e7-9c00-622d211a969b-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522412 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl9tk\" (UniqueName: \"kubernetes.io/projected/ac6c7297-a2fe-4569-b0a2-a0a2329df115-kube-api-access-fl9tk\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522465 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/40fdaf60-ff95-49e7-9c00-622d211a969b-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522514 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75ebf1a9-601e-4127-afcd-641216ae3a11-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522564 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522611 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/75ebf1a9-601e-4127-afcd-641216ae3a11-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.522948 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ebf1a9-601e-4127-afcd-641216ae3a11-kube-api-access-25gjk" (OuterVolumeSpecName: "kube-api-access-25gjk") pod "75ebf1a9-601e-4127-afcd-641216ae3a11" (UID: "75ebf1a9-601e-4127-afcd-641216ae3a11"). InnerVolumeSpecName "kube-api-access-25gjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.523396 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75ebf1a9-601e-4127-afcd-641216ae3a11-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "75ebf1a9-601e-4127-afcd-641216ae3a11" (UID: "75ebf1a9-601e-4127-afcd-641216ae3a11"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.524656 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40fdaf60-ff95-49e7-9c00-622d211a969b-kube-api-access-qzrdj" (OuterVolumeSpecName: "kube-api-access-qzrdj") pod "40fdaf60-ff95-49e7-9c00-622d211a969b" (UID: "40fdaf60-ff95-49e7-9c00-622d211a969b"). InnerVolumeSpecName "kube-api-access-qzrdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.524960 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40fdaf60-ff95-49e7-9c00-622d211a969b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "40fdaf60-ff95-49e7-9c00-622d211a969b" (UID: "40fdaf60-ff95-49e7-9c00-622d211a969b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:23:57 crc kubenswrapper[4781]: E1208 20:23:57.588657 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:43a24796dabde68270dbfefa107205e173fdd6a0dc701502858cadbede69da31" Dec 08 20:23:57 crc kubenswrapper[4781]: E1208 20:23:57.588817 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:43a24796dabde68270dbfefa107205e173fdd6a0dc701502858cadbede69da31,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h8h677h678hbh584h584h9bhc5h5bdh5dfh5ffhc9h669h667hfch76h588h5dh66dh56dh5d9h55dh5cdh565h576h576h568h66ch8dh5b7h5cfq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gb9fz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(da20c49c-5d8d-4701-a54b-c6ae7b6670db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.604766 4781 scope.go:117] "RemoveContainer" containerID="f058f0d5b3a17571a9cad81a1f1b9e051b85d6250f26b884666f25483a49a5c5" Dec 08 20:23:57 crc kubenswrapper[4781]: W1208 20:23:57.610545 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b714ffd_7c31_434d_a833_04abe6c8dcfb.slice/crio-9ba381cea379a839b22c5acc1351e225254ba61d93785dd53b1a15e8158c703f WatchSource:0}: Error finding container 9ba381cea379a839b22c5acc1351e225254ba61d93785dd53b1a15e8158c703f: Status 404 returned error can't find the container with id 9ba381cea379a839b22c5acc1351e225254ba61d93785dd53b1a15e8158c703f Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.629222 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/75ebf1a9-601e-4127-afcd-641216ae3a11-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 08 
20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.629276 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/40fdaf60-ff95-49e7-9c00-622d211a969b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.629296 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzrdj\" (UniqueName: \"kubernetes.io/projected/40fdaf60-ff95-49e7-9c00-622d211a969b-kube-api-access-qzrdj\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.629319 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25gjk\" (UniqueName: \"kubernetes.io/projected/75ebf1a9-601e-4127-afcd-641216ae3a11-kube-api-access-25gjk\") on node \"crc\" DevicePath \"\"" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.660831 4781 scope.go:117] "RemoveContainer" containerID="6528f5f6e505b33293a5f68a521362a16d1e4a97cb27ce5ef64333977e533f83" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.717498 4781 scope.go:117] "RemoveContainer" containerID="e17c07cbd65814e184dcb385361322a3da697853b6f5a2d6efdd5af7aca86ca0" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.813569 4781 scope.go:117] "RemoveContainer" containerID="5f7d25fcf036d3edffa6034a78fbd156419592ca8440b9ff8c70a8e1591f570e" Dec 08 20:23:57 crc kubenswrapper[4781]: I1208 20:23:57.879113 4781 scope.go:117] "RemoveContainer" containerID="f99257e52a618c9d6669a0a855d33f2421c2cb0209bb00953b89a3073be5f944" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.139907 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" path="/var/lib/kubelet/pods/1b14a49e-1bf0-4914-9625-14c58c351d6a/volumes" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.183153 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75dd9f77c4-85lhw"] Dec 08 20:23:58 crc kubenswrapper[4781]: 
I1208 20:23:58.193532 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p8pdd"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.203260 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.268262 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373","Type":"ContainerStarted","Data":"4e904a519fd92f06fab30d2bfc6966cc1aebb3a5a46d6d1d70ab5dcaa1700315"} Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.279356 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594cfd6f5c-pppkp" event={"ID":"965ff155-f5c6-4c49-9d46-18a62ef94308","Type":"ContainerStarted","Data":"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd"} Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.282420 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnfg8" event={"ID":"c4245549-33f8-4a0e-a17e-08417bed869c","Type":"ContainerStarted","Data":"14fa401579550a4ae87b42d8db67cadeb1144d67d296365a254283ce4a0dc75f"} Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.284212 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.287274 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75dd9f77c4-85lhw" event={"ID":"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf","Type":"ContainerStarted","Data":"8ce9c155d772f9babceadc1ea0a6e1df6d1f722a61cf7a02418a8a4f5552d388"} Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.289264 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bd65f4cd-rdlc7" 
event={"ID":"2b714ffd-7c31-434d-a833-04abe6c8dcfb","Type":"ContainerStarted","Data":"458aad91b280fae6c403fbd40963945d0173a54e50791aa99b5456b5962d7f8c"} Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.289298 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bd65f4cd-rdlc7" event={"ID":"2b714ffd-7c31-434d-a833-04abe6c8dcfb","Type":"ContainerStarted","Data":"9ba381cea379a839b22c5acc1351e225254ba61d93785dd53b1a15e8158c703f"} Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.290843 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p8pdd" event={"ID":"ec5cb8d0-f612-43f8-a3c2-27953b92735b","Type":"ContainerStarted","Data":"ef47bdc5af9c22a73147321a01cafa901a4e9aa04de971ed9a0d6b7848136b06"} Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.293037 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5ls9p" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.293301 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76b8bd6cbf-rvflb" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.293366 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-747cc54cb5-cgsnc" Dec 08 20:23:58 crc kubenswrapper[4781]: W1208 20:23:58.297248 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34bc4529_0766_4d9b_ad5c_5b78604ebb10.slice/crio-486ae9c6afe76de2d63c49388f13fa87f85bc61367d5d5d6f9ffa370ccb98026 WatchSource:0}: Error finding container 486ae9c6afe76de2d63c49388f13fa87f85bc61367d5d5d6f9ffa370ccb98026: Status 404 returned error can't find the container with id 486ae9c6afe76de2d63c49388f13fa87f85bc61367d5d5d6f9ffa370ccb98026 Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.302672 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-bnfg8" podStartSLOduration=5.436449759 podStartE2EDuration="28.302657486s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="2025-12-08 20:23:33.808852695 +0000 UTC m=+1129.960136072" lastFinishedPulling="2025-12-08 20:23:56.675060422 +0000 UTC m=+1152.826343799" observedRunningTime="2025-12-08 20:23:58.301097101 +0000 UTC m=+1154.452380478" watchObservedRunningTime="2025-12-08 20:23:58.302657486 +0000 UTC m=+1154.453940863" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.371274 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76b8bd6cbf-rvflb"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.398964 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76b8bd6cbf-rvflb"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.442274 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-747cc54cb5-cgsnc"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.449758 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-747cc54cb5-cgsnc"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.577245 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-849ff95dc5-pmqdz"] Dec 08 20:23:58 crc kubenswrapper[4781]: E1208 20:23:58.577752 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="init" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.577770 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="init" Dec 08 20:23:58 crc kubenswrapper[4781]: E1208 20:23:58.577799 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6c7297-a2fe-4569-b0a2-a0a2329df115" containerName="neutron-db-sync" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.577805 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6c7297-a2fe-4569-b0a2-a0a2329df115" containerName="neutron-db-sync" Dec 08 20:23:58 crc kubenswrapper[4781]: E1208 20:23:58.577822 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="dnsmasq-dns" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.577828 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="dnsmasq-dns" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.578002 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6c7297-a2fe-4569-b0a2-a0a2329df115" containerName="neutron-db-sync" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.578029 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="dnsmasq-dns" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.579023 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.604903 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-pmqdz"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.750974 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-86dcc84d-7tchk"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.752012 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-config\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.752078 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.752144 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vjw\" (UniqueName: \"kubernetes.io/projected/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-kube-api-access-49vjw\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.752244 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " 
pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.752291 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.752365 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.752388 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.754165 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.756039 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cdfmn" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.756312 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.756486 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.756632 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86dcc84d-7tchk"] Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854199 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854579 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49vjw\" (UniqueName: \"kubernetes.io/projected/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-kube-api-access-49vjw\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854610 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-ovndb-tls-certs\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854653 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-combined-ca-bundle\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854703 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-httpd-config\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854735 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854769 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854800 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-config\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854834 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q94vl\" (UniqueName: \"kubernetes.io/projected/0c055aae-fa31-4d84-aa1b-b60c8829a61b-kube-api-access-q94vl\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854871 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.854906 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-config\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.855886 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-config\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.856306 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.856738 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.857724 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.858291 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" 
(UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.884752 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vjw\" (UniqueName: \"kubernetes.io/projected/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-kube-api-access-49vjw\") pod \"dnsmasq-dns-849ff95dc5-pmqdz\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") " pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.913826 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.956194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q94vl\" (UniqueName: \"kubernetes.io/projected/0c055aae-fa31-4d84-aa1b-b60c8829a61b-kube-api-access-q94vl\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.956307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-ovndb-tls-certs\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.956356 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-combined-ca-bundle\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.956395 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-httpd-config\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.956457 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-config\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.960853 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-ovndb-tls-certs\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.962261 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-combined-ca-bundle\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.966038 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-config\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.966708 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-httpd-config\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " 
pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:58 crc kubenswrapper[4781]: I1208 20:23:58.992139 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q94vl\" (UniqueName: \"kubernetes.io/projected/0c055aae-fa31-4d84-aa1b-b60c8829a61b-kube-api-access-q94vl\") pod \"neutron-86dcc84d-7tchk\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") " pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.125381 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.276059 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-d879466b9-tx84q" podUID="1b14a49e-1bf0-4914-9625-14c58c351d6a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.332857 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bc4529-0766-4d9b-ad5c-5b78604ebb10","Type":"ContainerStarted","Data":"486ae9c6afe76de2d63c49388f13fa87f85bc61367d5d5d6f9ffa370ccb98026"} Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.334744 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594cfd6f5c-pppkp" event={"ID":"965ff155-f5c6-4c49-9d46-18a62ef94308","Type":"ContainerStarted","Data":"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc"} Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.335117 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-594cfd6f5c-pppkp" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerName="horizon-log" containerID="cri-o://63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd" gracePeriod=30 Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.335316 4781 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/horizon-594cfd6f5c-pppkp" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerName="horizon" containerID="cri-o://c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc" gracePeriod=30 Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.363899 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75dd9f77c4-85lhw" event={"ID":"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf","Type":"ContainerStarted","Data":"a13361e00f73faaf706d48c6615dd04e5343d1bfea9f6a7ef796f08e0a0306fd"} Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.363961 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75dd9f77c4-85lhw" event={"ID":"eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf","Type":"ContainerStarted","Data":"cd202b477d0390512a3e374a60297b7d008159e0523bc6806d9c8d55acac61e0"} Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.365909 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-594cfd6f5c-pppkp" podStartSLOduration=5.452882402 podStartE2EDuration="29.365885532s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="2025-12-08 20:23:33.71430066 +0000 UTC m=+1129.865584037" lastFinishedPulling="2025-12-08 20:23:57.62730379 +0000 UTC m=+1153.778587167" observedRunningTime="2025-12-08 20:23:59.363786862 +0000 UTC m=+1155.515070239" watchObservedRunningTime="2025-12-08 20:23:59.365885532 +0000 UTC m=+1155.517168909" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.392320 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bd65f4cd-rdlc7" event={"ID":"2b714ffd-7c31-434d-a833-04abe6c8dcfb","Type":"ContainerStarted","Data":"02389c5659f9a03f3983d67ab739b10de654ad7c39804e031b67fa0e1b85821a"} Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.428787 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75dd9f77c4-85lhw" 
podStartSLOduration=20.428767848 podStartE2EDuration="20.428767848s" podCreationTimestamp="2025-12-08 20:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:59.410359609 +0000 UTC m=+1155.561642986" watchObservedRunningTime="2025-12-08 20:23:59.428767848 +0000 UTC m=+1155.580051225" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.429809 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p8pdd" event={"ID":"ec5cb8d0-f612-43f8-a3c2-27953b92735b","Type":"ContainerStarted","Data":"f54fda88e581234a7d6da66f0aca5fed2f7c097886ea21329647c89526a4ae15"} Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.552791 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66bd65f4cd-rdlc7" podStartSLOduration=20.552773869 podStartE2EDuration="20.552773869s" podCreationTimestamp="2025-12-08 20:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:59.452444918 +0000 UTC m=+1155.603728295" watchObservedRunningTime="2025-12-08 20:23:59.552773869 +0000 UTC m=+1155.704057246" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.572236 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-p8pdd" podStartSLOduration=5.572215688 podStartE2EDuration="5.572215688s" podCreationTimestamp="2025-12-08 20:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:23:59.54827349 +0000 UTC m=+1155.699556857" watchObservedRunningTime="2025-12-08 20:23:59.572215688 +0000 UTC m=+1155.723499065" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.602457 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-pmqdz"] 
Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.926516 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.926951 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:23:59 crc kubenswrapper[4781]: I1208 20:23:59.979662 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86dcc84d-7tchk"] Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.040526 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.040586 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75dd9f77c4-85lhw" Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.141741 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40fdaf60-ff95-49e7-9c00-622d211a969b" path="/var/lib/kubelet/pods/40fdaf60-ff95-49e7-9c00-622d211a969b/volumes" Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.142326 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ebf1a9-601e-4127-afcd-641216ae3a11" path="/var/lib/kubelet/pods/75ebf1a9-601e-4127-afcd-641216ae3a11/volumes" Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.435199 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bc4529-0766-4d9b-ad5c-5b78604ebb10","Type":"ContainerStarted","Data":"2e145e1de2dd3ba34b7aa77b1adeb4e8a2412fa4b01e564867cea4cd041a30ef"} Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.437600 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86dcc84d-7tchk" event={"ID":"0c055aae-fa31-4d84-aa1b-b60c8829a61b","Type":"ContainerStarted","Data":"c7102c5f9b95799652a4bb652fa11cdc5f0afcaeb9f4630de40ceccd22db4715"} Dec 08 
20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.440399 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373","Type":"ContainerStarted","Data":"429298e82ab154c050eff815298b372a178b8e6a29f6a4b2ae8880707dd35c3f"} Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.443079 4781 generic.go:334] "Generic (PLEG): container finished" podID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" containerID="0c7e062cfccabee5d125366983c35cdea59eeff9f35f8f3ee8de058481f5b85a" exitCode=0 Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.443226 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" event={"ID":"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a","Type":"ContainerDied","Data":"0c7e062cfccabee5d125366983c35cdea59eeff9f35f8f3ee8de058481f5b85a"} Dec 08 20:24:00 crc kubenswrapper[4781]: I1208 20:24:00.443305 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" event={"ID":"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a","Type":"ContainerStarted","Data":"ae0e8ed4e5cae824a29a9c50f7837d8a609730018a9435e89ed637ab8f774348"} Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.067201 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.237299 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b4c587749-kmfjj"] Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.239943 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.243340 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.243595 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.268260 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b4c587749-kmfjj"] Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.383884 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-internal-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.383966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-ovndb-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.383996 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-config\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.384029 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62t5h\" (UniqueName: 
\"kubernetes.io/projected/b3890267-5a89-4612-89f0-1bb7ba0e1245-kube-api-access-62t5h\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.384056 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-combined-ca-bundle\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.384081 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-public-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.384110 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-httpd-config\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.456077 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86dcc84d-7tchk" event={"ID":"0c055aae-fa31-4d84-aa1b-b60c8829a61b","Type":"ContainerStarted","Data":"86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17"} Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.456132 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86dcc84d-7tchk" 
event={"ID":"0c055aae-fa31-4d84-aa1b-b60c8829a61b","Type":"ContainerStarted","Data":"7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8"} Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.456169 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.460465 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373","Type":"ContainerStarted","Data":"a4465a3460b996b53f4540724ce4aa04411dca5181d9568b333dadcf9166e0af"} Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.462719 4781 generic.go:334] "Generic (PLEG): container finished" podID="c4245549-33f8-4a0e-a17e-08417bed869c" containerID="14fa401579550a4ae87b42d8db67cadeb1144d67d296365a254283ce4a0dc75f" exitCode=0 Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.462808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnfg8" event={"ID":"c4245549-33f8-4a0e-a17e-08417bed869c","Type":"ContainerDied","Data":"14fa401579550a4ae87b42d8db67cadeb1144d67d296365a254283ce4a0dc75f"} Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.465830 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" event={"ID":"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a","Type":"ContainerStarted","Data":"6f09bcd45ea314fbb421767e43b183648dd8bf31e41ffae61eb661b526d1b414"} Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.466040 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.469664 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"34bc4529-0766-4d9b-ad5c-5b78604ebb10","Type":"ContainerStarted","Data":"1899ec1b6723e30dbfcfaa08dde9195c7ec8ebade14dc87e0f6c6f4be7dbaa10"} Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.475814 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-86dcc84d-7tchk" podStartSLOduration=3.475794737 podStartE2EDuration="3.475794737s" podCreationTimestamp="2025-12-08 20:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:01.474651574 +0000 UTC m=+1157.625934951" watchObservedRunningTime="2025-12-08 20:24:01.475794737 +0000 UTC m=+1157.627078114" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.485397 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-internal-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.485483 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-ovndb-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.485518 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-config\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.485554 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-62t5h\" (UniqueName: \"kubernetes.io/projected/b3890267-5a89-4612-89f0-1bb7ba0e1245-kube-api-access-62t5h\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.485581 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-combined-ca-bundle\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.485608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-public-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.485650 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-httpd-config\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.491897 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-internal-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.494316 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-config\") pod 
\"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.495918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-public-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.496321 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-httpd-config\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.507600 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-combined-ca-bundle\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.510437 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3890267-5a89-4612-89f0-1bb7ba0e1245-ovndb-tls-certs\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.510756 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.51073408 podStartE2EDuration="7.51073408s" podCreationTimestamp="2025-12-08 20:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-08 20:24:01.509211007 +0000 UTC m=+1157.660494384" watchObservedRunningTime="2025-12-08 20:24:01.51073408 +0000 UTC m=+1157.662017457" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.539887 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62t5h\" (UniqueName: \"kubernetes.io/projected/b3890267-5a89-4612-89f0-1bb7ba0e1245-kube-api-access-62t5h\") pod \"neutron-b4c587749-kmfjj\" (UID: \"b3890267-5a89-4612-89f0-1bb7ba0e1245\") " pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.573426 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.618600 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=16.618579338 podStartE2EDuration="16.618579338s" podCreationTimestamp="2025-12-08 20:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:01.594302161 +0000 UTC m=+1157.745585538" watchObservedRunningTime="2025-12-08 20:24:01.618579338 +0000 UTC m=+1157.769862715" Dec 08 20:24:01 crc kubenswrapper[4781]: I1208 20:24:01.630790 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" podStartSLOduration=3.630770738 podStartE2EDuration="3.630770738s" podCreationTimestamp="2025-12-08 20:23:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:01.629709497 +0000 UTC m=+1157.780992874" watchObservedRunningTime="2025-12-08 20:24:01.630770738 +0000 UTC m=+1157.782054115" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.144641 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-bnfg8" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.339375 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-config-data\") pod \"c4245549-33f8-4a0e-a17e-08417bed869c\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.339719 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4245549-33f8-4a0e-a17e-08417bed869c-logs\") pod \"c4245549-33f8-4a0e-a17e-08417bed869c\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.339821 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-scripts\") pod \"c4245549-33f8-4a0e-a17e-08417bed869c\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.339956 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjpht\" (UniqueName: \"kubernetes.io/projected/c4245549-33f8-4a0e-a17e-08417bed869c-kube-api-access-zjpht\") pod \"c4245549-33f8-4a0e-a17e-08417bed869c\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.340022 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-combined-ca-bundle\") pod \"c4245549-33f8-4a0e-a17e-08417bed869c\" (UID: \"c4245549-33f8-4a0e-a17e-08417bed869c\") " Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.340680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c4245549-33f8-4a0e-a17e-08417bed869c-logs" (OuterVolumeSpecName: "logs") pod "c4245549-33f8-4a0e-a17e-08417bed869c" (UID: "c4245549-33f8-4a0e-a17e-08417bed869c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.347272 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-scripts" (OuterVolumeSpecName: "scripts") pod "c4245549-33f8-4a0e-a17e-08417bed869c" (UID: "c4245549-33f8-4a0e-a17e-08417bed869c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.348347 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4245549-33f8-4a0e-a17e-08417bed869c-kube-api-access-zjpht" (OuterVolumeSpecName: "kube-api-access-zjpht") pod "c4245549-33f8-4a0e-a17e-08417bed869c" (UID: "c4245549-33f8-4a0e-a17e-08417bed869c"). InnerVolumeSpecName "kube-api-access-zjpht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.371339 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c4245549-33f8-4a0e-a17e-08417bed869c" (UID: "c4245549-33f8-4a0e-a17e-08417bed869c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.377052 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-config-data" (OuterVolumeSpecName: "config-data") pod "c4245549-33f8-4a0e-a17e-08417bed869c" (UID: "c4245549-33f8-4a0e-a17e-08417bed869c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.441750 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.441787 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4245549-33f8-4a0e-a17e-08417bed869c-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.441795 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.441803 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjpht\" (UniqueName: \"kubernetes.io/projected/c4245549-33f8-4a0e-a17e-08417bed869c-kube-api-access-zjpht\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.441813 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4245549-33f8-4a0e-a17e-08417bed869c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.501780 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-bnfg8" event={"ID":"c4245549-33f8-4a0e-a17e-08417bed869c","Type":"ContainerDied","Data":"37fa6cbb207827f3301373eaa82b6522efe52d4b8600467412da9bd8686099f6"} Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.501825 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37fa6cbb207827f3301373eaa82b6522efe52d4b8600467412da9bd8686099f6" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.501894 4781 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-bnfg8" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.507410 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da20c49c-5d8d-4701-a54b-c6ae7b6670db","Type":"ContainerStarted","Data":"974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d"} Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.649615 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b4c587749-kmfjj"] Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.682995 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.683140 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.732588 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 08 20:24:04 crc kubenswrapper[4781]: I1208 20:24:04.735444 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.254060 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7cb6b5bcfd-8jc9n"] Dec 08 20:24:05 crc kubenswrapper[4781]: E1208 20:24:05.254708 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4245549-33f8-4a0e-a17e-08417bed869c" containerName="placement-db-sync" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.254726 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4245549-33f8-4a0e-a17e-08417bed869c" containerName="placement-db-sync" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.254888 4781 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c4245549-33f8-4a0e-a17e-08417bed869c" containerName="placement-db-sync" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.255802 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.261551 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.262070 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.262357 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.264309 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.262253 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sthd5" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.269567 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cb6b5bcfd-8jc9n"] Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.276829 4781 scope.go:117] "RemoveContainer" containerID="88afe53c5a8eee7da8cda766b95b369c181a7607767c4baa97fc70c479a2aec2" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.327505 4781 scope.go:117] "RemoveContainer" containerID="fe81e6f5b9fddc74059cce1c62849a2c0cb01b6a4335f3d5d062def56465b5db" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.378109 4781 scope.go:117] "RemoveContainer" containerID="414c39e5d7b68bf7e81a1bdd0d2f019be2b4837b80537b7895c52716c602ae8e" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.435073 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-scripts\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.435138 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t67c\" (UniqueName: \"kubernetes.io/projected/684ebfa2-2a23-4f1f-96cc-d436e63feede-kube-api-access-7t67c\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.435174 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-public-tls-certs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.435214 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-config-data\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.435262 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-combined-ca-bundle\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.435284 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-internal-tls-certs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.435301 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684ebfa2-2a23-4f1f-96cc-d436e63feede-logs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.527149 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.527190 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.534292 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4c587749-kmfjj" event={"ID":"b3890267-5a89-4612-89f0-1bb7ba0e1245","Type":"ContainerStarted","Data":"71a3f4fb082017244879cc45e4fd5f7fabab9d8c88442b34473e35be5c010efe"} Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.534335 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4c587749-kmfjj" event={"ID":"b3890267-5a89-4612-89f0-1bb7ba0e1245","Type":"ContainerStarted","Data":"950ef32d37c05937d1749e3db42113490691bbdecf6781b20f8f22e28debfc97"} Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.534349 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4c587749-kmfjj" event={"ID":"b3890267-5a89-4612-89f0-1bb7ba0e1245","Type":"ContainerStarted","Data":"3f7de904c0281ac89ba41abb9edfdc1cb692630f5ccb84beda63b72350a0118a"} Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.535328 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b4c587749-kmfjj" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.536723 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-config-data\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.538320 4781 generic.go:334] "Generic (PLEG): container finished" podID="ec5cb8d0-f612-43f8-a3c2-27953b92735b" containerID="f54fda88e581234a7d6da66f0aca5fed2f7c097886ea21329647c89526a4ae15" exitCode=0 Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.539556 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p8pdd" event={"ID":"ec5cb8d0-f612-43f8-a3c2-27953b92735b","Type":"ContainerDied","Data":"f54fda88e581234a7d6da66f0aca5fed2f7c097886ea21329647c89526a4ae15"} Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.539601 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.539794 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.540041 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-combined-ca-bundle\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.540213 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-internal-tls-certs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.540243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684ebfa2-2a23-4f1f-96cc-d436e63feede-logs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.540528 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-scripts\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.540598 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t67c\" (UniqueName: \"kubernetes.io/projected/684ebfa2-2a23-4f1f-96cc-d436e63feede-kube-api-access-7t67c\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.540730 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-public-tls-certs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.542124 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/684ebfa2-2a23-4f1f-96cc-d436e63feede-logs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: 
\"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.545815 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-scripts\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.546772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-internal-tls-certs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.548078 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-config-data\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.577926 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b4c587749-kmfjj" podStartSLOduration=4.577885007 podStartE2EDuration="4.577885007s" podCreationTimestamp="2025-12-08 20:24:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:05.556890344 +0000 UTC m=+1161.708173731" watchObservedRunningTime="2025-12-08 20:24:05.577885007 +0000 UTC m=+1161.729168384" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.586158 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-combined-ca-bundle\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.592680 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/684ebfa2-2a23-4f1f-96cc-d436e63feede-public-tls-certs\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.600733 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t67c\" (UniqueName: \"kubernetes.io/projected/684ebfa2-2a23-4f1f-96cc-d436e63feede-kube-api-access-7t67c\") pod \"placement-7cb6b5bcfd-8jc9n\" (UID: \"684ebfa2-2a23-4f1f-96cc-d436e63feede\") " pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.628951 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.646570 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 08 20:24:05 crc kubenswrapper[4781]: I1208 20:24:05.897499 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:06 crc kubenswrapper[4781]: I1208 20:24:06.483559 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7cb6b5bcfd-8jc9n"] Dec 08 20:24:06 crc kubenswrapper[4781]: W1208 20:24:06.502081 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod684ebfa2_2a23_4f1f_96cc_d436e63feede.slice/crio-ce266608e807198cd05700075019661cc8190c0ec22d018d05edf959cbc04f79 WatchSource:0}: Error finding container ce266608e807198cd05700075019661cc8190c0ec22d018d05edf959cbc04f79: Status 404 returned error can't find the container with id ce266608e807198cd05700075019661cc8190c0ec22d018d05edf959cbc04f79 Dec 08 20:24:06 crc kubenswrapper[4781]: I1208 20:24:06.548324 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb6b5bcfd-8jc9n" event={"ID":"684ebfa2-2a23-4f1f-96cc-d436e63feede","Type":"ContainerStarted","Data":"ce266608e807198cd05700075019661cc8190c0ec22d018d05edf959cbc04f79"} Dec 08 20:24:06 crc kubenswrapper[4781]: I1208 20:24:06.555791 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 08 20:24:06 crc kubenswrapper[4781]: I1208 20:24:06.555818 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 08 20:24:06 crc kubenswrapper[4781]: I1208 20:24:06.975507 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.077537 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-config-data\") pod \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.077814 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-fernet-keys\") pod \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.077959 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-scripts\") pod \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.078187 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-combined-ca-bundle\") pod \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.078338 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-credential-keys\") pod \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.078485 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wth5g\" (UniqueName: 
\"kubernetes.io/projected/ec5cb8d0-f612-43f8-a3c2-27953b92735b-kube-api-access-wth5g\") pod \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\" (UID: \"ec5cb8d0-f612-43f8-a3c2-27953b92735b\") " Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.086036 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-scripts" (OuterVolumeSpecName: "scripts") pod "ec5cb8d0-f612-43f8-a3c2-27953b92735b" (UID: "ec5cb8d0-f612-43f8-a3c2-27953b92735b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.087906 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec5cb8d0-f612-43f8-a3c2-27953b92735b-kube-api-access-wth5g" (OuterVolumeSpecName: "kube-api-access-wth5g") pod "ec5cb8d0-f612-43f8-a3c2-27953b92735b" (UID: "ec5cb8d0-f612-43f8-a3c2-27953b92735b"). InnerVolumeSpecName "kube-api-access-wth5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.098138 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ec5cb8d0-f612-43f8-a3c2-27953b92735b" (UID: "ec5cb8d0-f612-43f8-a3c2-27953b92735b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.108296 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ec5cb8d0-f612-43f8-a3c2-27953b92735b" (UID: "ec5cb8d0-f612-43f8-a3c2-27953b92735b"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.123550 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec5cb8d0-f612-43f8-a3c2-27953b92735b" (UID: "ec5cb8d0-f612-43f8-a3c2-27953b92735b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.173050 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-config-data" (OuterVolumeSpecName: "config-data") pod "ec5cb8d0-f612-43f8-a3c2-27953b92735b" (UID: "ec5cb8d0-f612-43f8-a3c2-27953b92735b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.188283 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.188637 4781 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.188651 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wth5g\" (UniqueName: \"kubernetes.io/projected/ec5cb8d0-f612-43f8-a3c2-27953b92735b-kube-api-access-wth5g\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.188667 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.188679 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.188691 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec5cb8d0-f612-43f8-a3c2-27953b92735b-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.604871 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb6b5bcfd-8jc9n" event={"ID":"684ebfa2-2a23-4f1f-96cc-d436e63feede","Type":"ContainerStarted","Data":"b3917d1d0350edd57c56bc26488a727398144b18fed52b63c4f577aca6dbd762"} Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.604935 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7cb6b5bcfd-8jc9n" event={"ID":"684ebfa2-2a23-4f1f-96cc-d436e63feede","Type":"ContainerStarted","Data":"1ae36f7c3d31b1d1b6d4bb373b239ba46df52c0bc8b22091a0b0649a1c2977b2"} Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.605055 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.618460 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p8pdd" event={"ID":"ec5cb8d0-f612-43f8-a3c2-27953b92735b","Type":"ContainerDied","Data":"ef47bdc5af9c22a73147321a01cafa901a4e9aa04de971ed9a0d6b7848136b06"} Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.618519 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef47bdc5af9c22a73147321a01cafa901a4e9aa04de971ed9a0d6b7848136b06" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.618593 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-p8pdd" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.619393 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.619412 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.640491 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7cb6b5bcfd-8jc9n" podStartSLOduration=2.640474544 podStartE2EDuration="2.640474544s" podCreationTimestamp="2025-12-08 20:24:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:07.628090238 +0000 UTC m=+1163.779373605" watchObservedRunningTime="2025-12-08 20:24:07.640474544 +0000 UTC m=+1163.791757921" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.692721 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-796ff6864f-m6jf6"] Dec 08 20:24:07 crc kubenswrapper[4781]: E1208 20:24:07.693105 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec5cb8d0-f612-43f8-a3c2-27953b92735b" containerName="keystone-bootstrap" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.693121 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec5cb8d0-f612-43f8-a3c2-27953b92735b" containerName="keystone-bootstrap" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.693311 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec5cb8d0-f612-43f8-a3c2-27953b92735b" containerName="keystone-bootstrap" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.693874 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.697964 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.704470 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.704688 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4bstc" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.704733 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.705233 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.705459 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.734998 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-796ff6864f-m6jf6"] Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.800039 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-config-data\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.800133 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-scripts\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " 
pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.800226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-internal-tls-certs\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.800314 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-public-tls-certs\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.800421 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-credential-keys\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.800448 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-fernet-keys\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.800516 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-combined-ca-bundle\") pod \"keystone-796ff6864f-m6jf6\" (UID: 
\"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.800666 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfxn\" (UniqueName: \"kubernetes.io/projected/47dda6bf-1ed5-4df0-a647-32e518a7514f-kube-api-access-5rfxn\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.902110 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-config-data\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.902194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-scripts\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.902235 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-internal-tls-certs\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.902273 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-public-tls-certs\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " 
pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.902326 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-credential-keys\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.902356 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-fernet-keys\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.902396 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-combined-ca-bundle\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.902476 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfxn\" (UniqueName: \"kubernetes.io/projected/47dda6bf-1ed5-4df0-a647-32e518a7514f-kube-api-access-5rfxn\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.908912 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-fernet-keys\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 
20:24:07.909678 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-credential-keys\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.910500 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-config-data\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.911285 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-internal-tls-certs\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.911556 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-public-tls-certs\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.915364 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-scripts\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.921383 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/47dda6bf-1ed5-4df0-a647-32e518a7514f-combined-ca-bundle\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:07 crc kubenswrapper[4781]: I1208 20:24:07.924485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfxn\" (UniqueName: \"kubernetes.io/projected/47dda6bf-1ed5-4df0-a647-32e518a7514f-kube-api-access-5rfxn\") pod \"keystone-796ff6864f-m6jf6\" (UID: \"47dda6bf-1ed5-4df0-a647-32e518a7514f\") " pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:08 crc kubenswrapper[4781]: I1208 20:24:08.025792 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:08 crc kubenswrapper[4781]: I1208 20:24:08.258013 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 08 20:24:08 crc kubenswrapper[4781]: I1208 20:24:08.624106 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-796ff6864f-m6jf6"] Dec 08 20:24:08 crc kubenswrapper[4781]: I1208 20:24:08.630834 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:24:08 crc kubenswrapper[4781]: I1208 20:24:08.630851 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:24:08 crc kubenswrapper[4781]: I1208 20:24:08.631043 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:24:08 crc kubenswrapper[4781]: I1208 20:24:08.632168 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7cb6b5bcfd-8jc9n" Dec 08 20:24:08 crc kubenswrapper[4781]: I1208 20:24:08.916120 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.005056 4781 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-lk5tr"] Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.006785 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" podUID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerName="dnsmasq-dns" containerID="cri-o://bf42e12ed62d1871d0c5d435c0b91e48e8ea830556985295a2dec250f5258ab3" gracePeriod=10 Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.518766 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.652140 4781 generic.go:334] "Generic (PLEG): container finished" podID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerID="bf42e12ed62d1871d0c5d435c0b91e48e8ea830556985295a2dec250f5258ab3" exitCode=0 Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.652227 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" event={"ID":"8bf90300-5bf4-4ec5-ba4b-6145663748fa","Type":"ContainerDied","Data":"bf42e12ed62d1871d0c5d435c0b91e48e8ea830556985295a2dec250f5258ab3"} Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.663060 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-796ff6864f-m6jf6" event={"ID":"47dda6bf-1ed5-4df0-a647-32e518a7514f","Type":"ContainerStarted","Data":"65eeaca82883270dfd9b231492209131b6c4198d06c9862e95c38abc16fa04cb"} Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.663094 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-796ff6864f-m6jf6" event={"ID":"47dda6bf-1ed5-4df0-a647-32e518a7514f","Type":"ContainerStarted","Data":"5323a4c2fd2cf7ae9e60c884678e23b03c8e323413013cee4c6c71158974eccc"} Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.663538 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:09 crc 
kubenswrapper[4781]: I1208 20:24:09.753169 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-796ff6864f-m6jf6" podStartSLOduration=2.753144889 podStartE2EDuration="2.753144889s" podCreationTimestamp="2025-12-08 20:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:09.7368238 +0000 UTC m=+1165.888107187" watchObservedRunningTime="2025-12-08 20:24:09.753144889 +0000 UTC m=+1165.904428266" Dec 08 20:24:09 crc kubenswrapper[4781]: I1208 20:24:09.932470 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66bd65f4cd-rdlc7" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 08 20:24:10 crc kubenswrapper[4781]: I1208 20:24:10.067076 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75dd9f77c4-85lhw" podUID="eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 08 20:24:10 crc kubenswrapper[4781]: I1208 20:24:10.142365 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 08 20:24:10 crc kubenswrapper[4781]: I1208 20:24:10.142491 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:24:10 crc kubenswrapper[4781]: I1208 20:24:10.531823 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 08 20:24:11 crc kubenswrapper[4781]: I1208 20:24:11.887778 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" Dec 08 20:24:11 crc kubenswrapper[4781]: I1208 20:24:11.994810 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-nb\") pod \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " Dec 08 20:24:11 crc kubenswrapper[4781]: I1208 20:24:11.994900 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-swift-storage-0\") pod \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " Dec 08 20:24:11 crc kubenswrapper[4781]: I1208 20:24:11.994973 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-sb\") pod \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " Dec 08 20:24:11 crc kubenswrapper[4781]: I1208 20:24:11.995039 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-config\") pod \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " Dec 08 20:24:11 crc kubenswrapper[4781]: I1208 20:24:11.995106 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-svc\") pod \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " Dec 08 20:24:11 crc kubenswrapper[4781]: I1208 20:24:11.995174 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlnch\" 
(UniqueName: \"kubernetes.io/projected/8bf90300-5bf4-4ec5-ba4b-6145663748fa-kube-api-access-tlnch\") pod \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\" (UID: \"8bf90300-5bf4-4ec5-ba4b-6145663748fa\") " Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.049080 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf90300-5bf4-4ec5-ba4b-6145663748fa-kube-api-access-tlnch" (OuterVolumeSpecName: "kube-api-access-tlnch") pod "8bf90300-5bf4-4ec5-ba4b-6145663748fa" (UID: "8bf90300-5bf4-4ec5-ba4b-6145663748fa"). InnerVolumeSpecName "kube-api-access-tlnch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.081198 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8bf90300-5bf4-4ec5-ba4b-6145663748fa" (UID: "8bf90300-5bf4-4ec5-ba4b-6145663748fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.084498 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-config" (OuterVolumeSpecName: "config") pod "8bf90300-5bf4-4ec5-ba4b-6145663748fa" (UID: "8bf90300-5bf4-4ec5-ba4b-6145663748fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.086446 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8bf90300-5bf4-4ec5-ba4b-6145663748fa" (UID: "8bf90300-5bf4-4ec5-ba4b-6145663748fa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.088087 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8bf90300-5bf4-4ec5-ba4b-6145663748fa" (UID: "8bf90300-5bf4-4ec5-ba4b-6145663748fa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.096468 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8bf90300-5bf4-4ec5-ba4b-6145663748fa" (UID: "8bf90300-5bf4-4ec5-ba4b-6145663748fa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.101202 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.101241 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.101253 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.101268 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-config\") on node 
\"crc\" DevicePath \"\"" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.101279 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8bf90300-5bf4-4ec5-ba4b-6145663748fa-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.101290 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlnch\" (UniqueName: \"kubernetes.io/projected/8bf90300-5bf4-4ec5-ba4b-6145663748fa-kube-api-access-tlnch\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.688419 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" event={"ID":"8bf90300-5bf4-4ec5-ba4b-6145663748fa","Type":"ContainerDied","Data":"637f931e83d20cff4432c8c36ea7f552ea6f30f23da92d69db600aaeda68e4f4"} Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.688480 4781 scope.go:117] "RemoveContainer" containerID="bf42e12ed62d1871d0c5d435c0b91e48e8ea830556985295a2dec250f5258ab3" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.688625 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.720697 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-lk5tr"] Dec 08 20:24:12 crc kubenswrapper[4781]: I1208 20:24:12.732575 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-lk5tr"] Dec 08 20:24:13 crc kubenswrapper[4781]: I1208 20:24:13.957195 4781 scope.go:117] "RemoveContainer" containerID="61e1d59585782f2e2c7e98ef23a5b5ae06610dfdd07a3441aa925806f81db1c0" Dec 08 20:24:14 crc kubenswrapper[4781]: I1208 20:24:14.154465 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" path="/var/lib/kubelet/pods/8bf90300-5bf4-4ec5-ba4b-6145663748fa/volumes" Dec 08 20:24:14 crc kubenswrapper[4781]: I1208 20:24:14.708452 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9wz9p" event={"ID":"279c45d3-ece4-42b6-8968-90806c171bf9","Type":"ContainerStarted","Data":"112b865222beb978823f687a69cf100b9eedd1374633ddc235bb6bff1f8f404e"} Dec 08 20:24:14 crc kubenswrapper[4781]: I1208 20:24:14.718644 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da20c49c-5d8d-4701-a54b-c6ae7b6670db","Type":"ContainerStarted","Data":"1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d"} Dec 08 20:24:14 crc kubenswrapper[4781]: I1208 20:24:14.732340 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9wz9p" podStartSLOduration=8.774481306 podStartE2EDuration="44.732320669s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="2025-12-08 20:23:33.811988445 +0000 UTC m=+1129.963271832" lastFinishedPulling="2025-12-08 20:24:09.769827818 +0000 UTC m=+1165.921111195" observedRunningTime="2025-12-08 20:24:14.723105755 +0000 UTC m=+1170.874389142" 
watchObservedRunningTime="2025-12-08 20:24:14.732320669 +0000 UTC m=+1170.883604056" Dec 08 20:24:15 crc kubenswrapper[4781]: I1208 20:24:15.743039 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-75vwt" event={"ID":"499d0466-ecbe-4866-b516-3c778c16ec94","Type":"ContainerStarted","Data":"57bc8a3faf644b94b9e9fb2842e5d3b0e86071d2c94bf52cd4d1f16eeec5a217"} Dec 08 20:24:15 crc kubenswrapper[4781]: I1208 20:24:15.770198 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-75vwt" podStartSLOduration=4.912920966 podStartE2EDuration="45.77017784s" podCreationTimestamp="2025-12-08 20:23:30 +0000 UTC" firstStartedPulling="2025-12-08 20:23:33.20036426 +0000 UTC m=+1129.351647637" lastFinishedPulling="2025-12-08 20:24:14.057621134 +0000 UTC m=+1170.208904511" observedRunningTime="2025-12-08 20:24:15.762381346 +0000 UTC m=+1171.913664733" watchObservedRunningTime="2025-12-08 20:24:15.77017784 +0000 UTC m=+1171.921461217" Dec 08 20:24:16 crc kubenswrapper[4781]: I1208 20:24:16.233945 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74fd8b655f-lk5tr" podUID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Dec 08 20:24:17 crc kubenswrapper[4781]: I1208 20:24:17.763197 4781 generic.go:334] "Generic (PLEG): container finished" podID="279c45d3-ece4-42b6-8968-90806c171bf9" containerID="112b865222beb978823f687a69cf100b9eedd1374633ddc235bb6bff1f8f404e" exitCode=0 Dec 08 20:24:17 crc kubenswrapper[4781]: I1208 20:24:17.763475 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9wz9p" event={"ID":"279c45d3-ece4-42b6-8968-90806c171bf9","Type":"ContainerDied","Data":"112b865222beb978823f687a69cf100b9eedd1374633ddc235bb6bff1f8f404e"} Dec 08 20:24:19 crc kubenswrapper[4781]: I1208 20:24:19.796756 4781 generic.go:334] "Generic (PLEG): container 
finished" podID="499d0466-ecbe-4866-b516-3c778c16ec94" containerID="57bc8a3faf644b94b9e9fb2842e5d3b0e86071d2c94bf52cd4d1f16eeec5a217" exitCode=0 Dec 08 20:24:19 crc kubenswrapper[4781]: I1208 20:24:19.796854 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-75vwt" event={"ID":"499d0466-ecbe-4866-b516-3c778c16ec94","Type":"ContainerDied","Data":"57bc8a3faf644b94b9e9fb2842e5d3b0e86071d2c94bf52cd4d1f16eeec5a217"} Dec 08 20:24:19 crc kubenswrapper[4781]: I1208 20:24:19.925887 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66bd65f4cd-rdlc7" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 08 20:24:20 crc kubenswrapper[4781]: I1208 20:24:20.040995 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75dd9f77c4-85lhw" podUID="eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.797261 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.831536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-75vwt" event={"ID":"499d0466-ecbe-4866-b516-3c778c16ec94","Type":"ContainerDied","Data":"f16d2c47262058cb18175d4dab6fbb4dd2f2cb942ba60bec0233849a27bbed4a"} Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.831581 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f16d2c47262058cb18175d4dab6fbb4dd2f2cb942ba60bec0233849a27bbed4a" Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.834356 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9wz9p" event={"ID":"279c45d3-ece4-42b6-8968-90806c171bf9","Type":"ContainerDied","Data":"9fa7d917a5f507ce4d588bb23b23ce0e2b3b22ff604c02a56f29536150a355db"} Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.834393 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fa7d917a5f507ce4d588bb23b23ce0e2b3b22ff604c02a56f29536150a355db" Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.834449 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9wz9p" Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.920151 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-db-sync-config-data\") pod \"279c45d3-ece4-42b6-8968-90806c171bf9\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.921222 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-combined-ca-bundle\") pod \"279c45d3-ece4-42b6-8968-90806c171bf9\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.921376 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjgtj\" (UniqueName: \"kubernetes.io/projected/279c45d3-ece4-42b6-8968-90806c171bf9-kube-api-access-tjgtj\") pod \"279c45d3-ece4-42b6-8968-90806c171bf9\" (UID: \"279c45d3-ece4-42b6-8968-90806c171bf9\") " Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.924630 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "279c45d3-ece4-42b6-8968-90806c171bf9" (UID: "279c45d3-ece4-42b6-8968-90806c171bf9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.928816 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279c45d3-ece4-42b6-8968-90806c171bf9-kube-api-access-tjgtj" (OuterVolumeSpecName: "kube-api-access-tjgtj") pod "279c45d3-ece4-42b6-8968-90806c171bf9" (UID: "279c45d3-ece4-42b6-8968-90806c171bf9"). 
InnerVolumeSpecName "kube-api-access-tjgtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.954045 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "279c45d3-ece4-42b6-8968-90806c171bf9" (UID: "279c45d3-ece4-42b6-8968-90806c171bf9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:21 crc kubenswrapper[4781]: I1208 20:24:21.954206 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-75vwt" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.025477 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.025512 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/279c45d3-ece4-42b6-8968-90806c171bf9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.025525 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjgtj\" (UniqueName: \"kubernetes.io/projected/279c45d3-ece4-42b6-8968-90806c171bf9-kube-api-access-tjgtj\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: E1208 20:24:22.069454 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.126170 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-scripts\") pod \"499d0466-ecbe-4866-b516-3c778c16ec94\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.126236 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-config-data\") pod \"499d0466-ecbe-4866-b516-3c778c16ec94\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.126269 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-combined-ca-bundle\") pod \"499d0466-ecbe-4866-b516-3c778c16ec94\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.126447 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/499d0466-ecbe-4866-b516-3c778c16ec94-etc-machine-id\") pod \"499d0466-ecbe-4866-b516-3c778c16ec94\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.126510 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-db-sync-config-data\") pod \"499d0466-ecbe-4866-b516-3c778c16ec94\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.126560 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q8km\" (UniqueName: \"kubernetes.io/projected/499d0466-ecbe-4866-b516-3c778c16ec94-kube-api-access-5q8km\") pod 
\"499d0466-ecbe-4866-b516-3c778c16ec94\" (UID: \"499d0466-ecbe-4866-b516-3c778c16ec94\") " Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.126683 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/499d0466-ecbe-4866-b516-3c778c16ec94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "499d0466-ecbe-4866-b516-3c778c16ec94" (UID: "499d0466-ecbe-4866-b516-3c778c16ec94"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.126990 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/499d0466-ecbe-4866-b516-3c778c16ec94-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.137297 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/499d0466-ecbe-4866-b516-3c778c16ec94-kube-api-access-5q8km" (OuterVolumeSpecName: "kube-api-access-5q8km") pod "499d0466-ecbe-4866-b516-3c778c16ec94" (UID: "499d0466-ecbe-4866-b516-3c778c16ec94"). InnerVolumeSpecName "kube-api-access-5q8km". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.138079 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "499d0466-ecbe-4866-b516-3c778c16ec94" (UID: "499d0466-ecbe-4866-b516-3c778c16ec94"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.138203 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-scripts" (OuterVolumeSpecName: "scripts") pod "499d0466-ecbe-4866-b516-3c778c16ec94" (UID: "499d0466-ecbe-4866-b516-3c778c16ec94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.188746 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "499d0466-ecbe-4866-b516-3c778c16ec94" (UID: "499d0466-ecbe-4866-b516-3c778c16ec94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.225009 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-config-data" (OuterVolumeSpecName: "config-data") pod "499d0466-ecbe-4866-b516-3c778c16ec94" (UID: "499d0466-ecbe-4866-b516-3c778c16ec94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.229769 4781 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.229833 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q8km\" (UniqueName: \"kubernetes.io/projected/499d0466-ecbe-4866-b516-3c778c16ec94-kube-api-access-5q8km\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.229852 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.229864 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.229875 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499d0466-ecbe-4866-b516-3c778c16ec94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.848300 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-75vwt" Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.849390 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="proxy-httpd" containerID="cri-o://14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1" gracePeriod=30 Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.849557 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="sg-core" containerID="cri-o://1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d" gracePeriod=30 Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.848464 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="ceilometer-notification-agent" containerID="cri-o://974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d" gracePeriod=30 Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.848297 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da20c49c-5d8d-4701-a54b-c6ae7b6670db","Type":"ContainerStarted","Data":"14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1"} Dec 08 20:24:22 crc kubenswrapper[4781]: I1208 20:24:22.849792 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.077968 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-676d96fbc7-6xfzf"] Dec 08 20:24:23 crc kubenswrapper[4781]: E1208 20:24:23.078461 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerName="dnsmasq-dns" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.078479 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerName="dnsmasq-dns" Dec 08 20:24:23 crc kubenswrapper[4781]: E1208 20:24:23.078497 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499d0466-ecbe-4866-b516-3c778c16ec94" containerName="cinder-db-sync" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.078505 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="499d0466-ecbe-4866-b516-3c778c16ec94" containerName="cinder-db-sync" Dec 08 20:24:23 crc kubenswrapper[4781]: E1208 20:24:23.078531 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerName="init" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.078539 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerName="init" Dec 08 20:24:23 crc kubenswrapper[4781]: E1208 20:24:23.078559 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279c45d3-ece4-42b6-8968-90806c171bf9" containerName="barbican-db-sync" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.078566 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="279c45d3-ece4-42b6-8968-90806c171bf9" containerName="barbican-db-sync" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.078797 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="499d0466-ecbe-4866-b516-3c778c16ec94" containerName="cinder-db-sync" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.078832 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="279c45d3-ece4-42b6-8968-90806c171bf9" containerName="barbican-db-sync" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.078846 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf90300-5bf4-4ec5-ba4b-6145663748fa" containerName="dnsmasq-dns" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.079961 4781 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.083403 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.083690 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.086688 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4f784" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.102528 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-676d96fbc7-6xfzf"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.180289 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6d674cb658-5bj7r"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.181639 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.190275 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.200764 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65dd957765-st75q"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.202636 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.245067 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6d674cb658-5bj7r"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.249269 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/558023e1-d94c-4422-a958-796ba9bf387f-logs\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.249327 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-config-data\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.249430 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qjdt\" (UniqueName: \"kubernetes.io/projected/558023e1-d94c-4422-a958-796ba9bf387f-kube-api-access-9qjdt\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.249498 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-combined-ca-bundle\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.249768 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-config-data-custom\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.266022 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-st75q"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.381587 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-config-data-custom\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.381709 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b269l\" (UniqueName: \"kubernetes.io/projected/a13c65f3-ffed-4345-a182-05e8c949d6a3-kube-api-access-b269l\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.381751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.381841 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.381880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-config-data-custom\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382095 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/558023e1-d94c-4422-a958-796ba9bf387f-logs\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382157 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-config-data\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382188 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-config\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382243 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-combined-ca-bundle\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qjdt\" (UniqueName: \"kubernetes.io/projected/558023e1-d94c-4422-a958-796ba9bf387f-kube-api-access-9qjdt\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382464 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-combined-ca-bundle\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382546 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-svc\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382576 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382662 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-config-data\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382732 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxrcv\" (UniqueName: \"kubernetes.io/projected/dcaa98ab-e000-41ef-bf28-189680138c66-kube-api-access-hxrcv\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.382756 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcaa98ab-e000-41ef-bf28-189680138c66-logs\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.386166 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/558023e1-d94c-4422-a958-796ba9bf387f-logs\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.390475 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.395857 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-config-data-custom\") pod 
\"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.396605 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-combined-ca-bundle\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.407247 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/558023e1-d94c-4422-a958-796ba9bf387f-config-data\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.414796 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qjdt\" (UniqueName: \"kubernetes.io/projected/558023e1-d94c-4422-a958-796ba9bf387f-kube-api-access-9qjdt\") pod \"barbican-worker-676d96fbc7-6xfzf\" (UID: \"558023e1-d94c-4422-a958-796ba9bf387f\") " pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.426861 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-676d96fbc7-6xfzf" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.430825 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.430983 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.450261 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.450472 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nmjwb" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.450557 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.450648 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.475166 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-55d57ffb96-9nv59"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.479762 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.485477 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.486873 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.487100 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knh2j\" (UniqueName: \"kubernetes.io/projected/ec3cd201-8baf-495f-aef9-91043bdfb8cb-kube-api-access-knh2j\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.487210 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-config-data-custom\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.487330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b269l\" (UniqueName: \"kubernetes.io/projected/a13c65f3-ffed-4345-a182-05e8c949d6a3-kube-api-access-b269l\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.487439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.487541 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.487637 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.487762 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.488112 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-config\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.488203 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-combined-ca-bundle\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.488365 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-svc\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.488454 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.488569 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-config-data\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.488656 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.488758 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxrcv\" (UniqueName: 
\"kubernetes.io/projected/dcaa98ab-e000-41ef-bf28-189680138c66-kube-api-access-hxrcv\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.488854 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dcaa98ab-e000-41ef-bf28-189680138c66-logs\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.494283 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec3cd201-8baf-495f-aef9-91043bdfb8cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.491814 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.492343 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-svc\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.495393 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dcaa98ab-e000-41ef-bf28-189680138c66-logs\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.497662 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-config\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.498370 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.498391 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-st75q"] Dec 08 20:24:23 crc kubenswrapper[4781]: E1208 20:24:23.504230 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-b269l ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-65dd957765-st75q" podUID="a13c65f3-ffed-4345-a182-05e8c949d6a3" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.491330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.511400 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-combined-ca-bundle\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.515979 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55d57ffb96-9nv59"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.526917 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-config-data-custom\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.530175 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b269l\" (UniqueName: \"kubernetes.io/projected/a13c65f3-ffed-4345-a182-05e8c949d6a3-kube-api-access-b269l\") pod \"dnsmasq-dns-65dd957765-st75q\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.541319 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcaa98ab-e000-41ef-bf28-189680138c66-config-data\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: \"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.574774 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxrcv\" (UniqueName: \"kubernetes.io/projected/dcaa98ab-e000-41ef-bf28-189680138c66-kube-api-access-hxrcv\") pod \"barbican-keystone-listener-6d674cb658-5bj7r\" (UID: 
\"dcaa98ab-e000-41ef-bf28-189680138c66\") " pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596247 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596363 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-combined-ca-bundle\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596523 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrfhg\" (UniqueName: \"kubernetes.io/projected/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-kube-api-access-hrfhg\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596556 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec3cd201-8baf-495f-aef9-91043bdfb8cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596654 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596681 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-logs\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596721 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knh2j\" (UniqueName: \"kubernetes.io/projected/ec3cd201-8baf-495f-aef9-91043bdfb8cb-kube-api-access-knh2j\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596763 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data-custom\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " 
pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.596823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.601019 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec3cd201-8baf-495f-aef9-91043bdfb8cb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.604598 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-scripts\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.606057 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.607360 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.612252 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.620237 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knh2j\" (UniqueName: \"kubernetes.io/projected/ec3cd201-8baf-495f-aef9-91043bdfb8cb-kube-api-access-knh2j\") pod \"cinder-scheduler-0\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.622764 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-dtltj"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.625960 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.638236 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-dtltj"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.658042 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.660004 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.662633 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.673234 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.698558 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6q4w\" (UniqueName: \"kubernetes.io/projected/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-kube-api-access-v6q4w\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.698636 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-logs\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.698664 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.698685 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.698721 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data-custom\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.698741 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.698967 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699007 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699071 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-config\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699098 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-logs\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699119 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699136 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699152 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-scripts\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699176 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-logs\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699189 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzrq6\" (UniqueName: \"kubernetes.io/projected/20fec9e5-0d25-4c44-b756-b163add48592-kube-api-access-dzrq6\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699329 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-combined-ca-bundle\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699368 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrfhg\" (UniqueName: \"kubernetes.io/projected/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-kube-api-access-hrfhg\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.699390 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.706836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-combined-ca-bundle\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.707699 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.710850 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data-custom\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.719945 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrfhg\" (UniqueName: \"kubernetes.io/projected/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-kube-api-access-hrfhg\") pod \"barbican-api-55d57ffb96-9nv59\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801733 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzrq6\" (UniqueName: \"kubernetes.io/projected/20fec9e5-0d25-4c44-b756-b163add48592-kube-api-access-dzrq6\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801793 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801817 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6q4w\" (UniqueName: \"kubernetes.io/projected/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-kube-api-access-v6q4w\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801847 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801863 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801893 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801928 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: 
\"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801949 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.801994 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-config\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.802016 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-logs\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.802036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.802052 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.802068 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-scripts\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.803315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.804243 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.808518 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.808597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.809251 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-config\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: 
\"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.809555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-logs\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.810198 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.812263 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-scripts\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.812698 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.820092 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.821692 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data-custom\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.821828 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.825622 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6q4w\" (UniqueName: \"kubernetes.io/projected/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-kube-api-access-v6q4w\") pod \"cinder-api-0\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " pod="openstack/cinder-api-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.826797 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzrq6\" (UniqueName: \"kubernetes.io/projected/20fec9e5-0d25-4c44-b756-b163add48592-kube-api-access-dzrq6\") pod \"dnsmasq-dns-5c77d8b67c-dtltj\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.869406 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.876046 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.894167 4781 generic.go:334] "Generic (PLEG): container finished" podID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerID="14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1" exitCode=0 Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.894208 4781 generic.go:334] "Generic (PLEG): container finished" podID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerID="1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d" exitCode=2 Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.894264 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.894320 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da20c49c-5d8d-4701-a54b-c6ae7b6670db","Type":"ContainerDied","Data":"14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1"} Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.894380 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da20c49c-5d8d-4701-a54b-c6ae7b6670db","Type":"ContainerDied","Data":"1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d"} Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.949650 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.953346 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:23 crc kubenswrapper[4781]: I1208 20:24:23.979207 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.004253 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-svc\") pod \"a13c65f3-ffed-4345-a182-05e8c949d6a3\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.004302 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b269l\" (UniqueName: \"kubernetes.io/projected/a13c65f3-ffed-4345-a182-05e8c949d6a3-kube-api-access-b269l\") pod \"a13c65f3-ffed-4345-a182-05e8c949d6a3\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.004339 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-sb\") pod \"a13c65f3-ffed-4345-a182-05e8c949d6a3\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.004515 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-swift-storage-0\") pod \"a13c65f3-ffed-4345-a182-05e8c949d6a3\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.004538 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-config\") pod \"a13c65f3-ffed-4345-a182-05e8c949d6a3\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") 
" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.004588 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-nb\") pod \"a13c65f3-ffed-4345-a182-05e8c949d6a3\" (UID: \"a13c65f3-ffed-4345-a182-05e8c949d6a3\") " Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.005262 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a13c65f3-ffed-4345-a182-05e8c949d6a3" (UID: "a13c65f3-ffed-4345-a182-05e8c949d6a3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.005284 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a13c65f3-ffed-4345-a182-05e8c949d6a3" (UID: "a13c65f3-ffed-4345-a182-05e8c949d6a3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.005643 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a13c65f3-ffed-4345-a182-05e8c949d6a3" (UID: "a13c65f3-ffed-4345-a182-05e8c949d6a3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.005845 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a13c65f3-ffed-4345-a182-05e8c949d6a3" (UID: "a13c65f3-ffed-4345-a182-05e8c949d6a3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.005968 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-config" (OuterVolumeSpecName: "config") pod "a13c65f3-ffed-4345-a182-05e8c949d6a3" (UID: "a13c65f3-ffed-4345-a182-05e8c949d6a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.010621 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13c65f3-ffed-4345-a182-05e8c949d6a3-kube-api-access-b269l" (OuterVolumeSpecName: "kube-api-access-b269l") pod "a13c65f3-ffed-4345-a182-05e8c949d6a3" (UID: "a13c65f3-ffed-4345-a182-05e8c949d6a3"). InnerVolumeSpecName "kube-api-access-b269l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.050628 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-676d96fbc7-6xfzf"] Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.108273 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.108312 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.108325 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.108336 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.108347 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b269l\" (UniqueName: \"kubernetes.io/projected/a13c65f3-ffed-4345-a182-05e8c949d6a3-kube-api-access-b269l\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.108358 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a13c65f3-ffed-4345-a182-05e8c949d6a3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.367910 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-6d674cb658-5bj7r"] Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.504851 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-55d57ffb96-9nv59"] Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.626015 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-dtltj"] Dec 08 20:24:24 crc kubenswrapper[4781]: W1208 20:24:24.636411 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20fec9e5_0d25_4c44_b756_b163add48592.slice/crio-6efe0e059ef03846acd9b979b140a228319bebfc010094c07203bc86ef5d307e WatchSource:0}: Error finding container 6efe0e059ef03846acd9b979b140a228319bebfc010094c07203bc86ef5d307e: Status 404 returned error can't find the container with id 6efe0e059ef03846acd9b979b140a228319bebfc010094c07203bc86ef5d307e Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.676492 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.808863 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:24 crc kubenswrapper[4781]: W1208 20:24:24.837510 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aeb9976_ac47_4b7f_abc6_ce64e78619c5.slice/crio-8ce20ff089415cef84496c9aa2894fc9371264c4c743fe6b75d89b3e79279cf1 WatchSource:0}: Error finding container 8ce20ff089415cef84496c9aa2894fc9371264c4c743fe6b75d89b3e79279cf1: Status 404 returned error can't find the container with id 8ce20ff089415cef84496c9aa2894fc9371264c4c743fe6b75d89b3e79279cf1 Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.911782 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"3aeb9976-ac47-4b7f-abc6-ce64e78619c5","Type":"ContainerStarted","Data":"8ce20ff089415cef84496c9aa2894fc9371264c4c743fe6b75d89b3e79279cf1"} Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.919220 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" event={"ID":"dcaa98ab-e000-41ef-bf28-189680138c66","Type":"ContainerStarted","Data":"94501999d300f638868ca6da85c7aad3d353b7d68b5413817b4697c6f9039697"} Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.926603 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec3cd201-8baf-495f-aef9-91043bdfb8cb","Type":"ContainerStarted","Data":"93b85babd040f8fda4618fe986a7283b71eabe156e0d9bbc829f9042f005c2f1"} Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.928031 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-676d96fbc7-6xfzf" event={"ID":"558023e1-d94c-4422-a958-796ba9bf387f","Type":"ContainerStarted","Data":"7e0d50f09623bbb6ae392bb941bf6c7d077cb912e4c2acf660a7e60ec826f5c8"} Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.932781 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" event={"ID":"20fec9e5-0d25-4c44-b756-b163add48592","Type":"ContainerStarted","Data":"6efe0e059ef03846acd9b979b140a228319bebfc010094c07203bc86ef5d307e"} Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.939183 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-st75q" Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.940106 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d57ffb96-9nv59" event={"ID":"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3","Type":"ContainerStarted","Data":"2ce8da86a5c34c4bde70c84dab43dd46393f6d7dd4e26d5c08d6b8210f48d74b"} Dec 08 20:24:24 crc kubenswrapper[4781]: I1208 20:24:24.940225 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d57ffb96-9nv59" event={"ID":"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3","Type":"ContainerStarted","Data":"cef4a7b5ddd92c6dfbf933ba7eb649f6359a16ae961c0a953f3cca26ac9cb126"} Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.071209 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-st75q"] Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.089871 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-st75q"] Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.654426 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.961896 4781 generic.go:334] "Generic (PLEG): container finished" podID="20fec9e5-0d25-4c44-b756-b163add48592" containerID="87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793" exitCode=0 Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.962050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" event={"ID":"20fec9e5-0d25-4c44-b756-b163add48592","Type":"ContainerDied","Data":"87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793"} Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.968992 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d57ffb96-9nv59" 
event={"ID":"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3","Type":"ContainerStarted","Data":"625ea809d498c0e401b8086b5ac3d0a4cdf7662262e4daa59b098426c120d4e1"} Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.970873 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.970970 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:25 crc kubenswrapper[4781]: I1208 20:24:25.979671 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3aeb9976-ac47-4b7f-abc6-ce64e78619c5","Type":"ContainerStarted","Data":"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749"} Dec 08 20:24:26 crc kubenswrapper[4781]: I1208 20:24:26.016907 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-55d57ffb96-9nv59" podStartSLOduration=3.01688836 podStartE2EDuration="3.01688836s" podCreationTimestamp="2025-12-08 20:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:26.003141115 +0000 UTC m=+1182.154424512" watchObservedRunningTime="2025-12-08 20:24:26.01688836 +0000 UTC m=+1182.168171737" Dec 08 20:24:26 crc kubenswrapper[4781]: I1208 20:24:26.151660 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13c65f3-ffed-4345-a182-05e8c949d6a3" path="/var/lib/kubelet/pods/a13c65f3-ffed-4345-a182-05e8c949d6a3/volumes" Dec 08 20:24:26 crc kubenswrapper[4781]: I1208 20:24:26.995873 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec3cd201-8baf-495f-aef9-91043bdfb8cb","Type":"ContainerStarted","Data":"ba7bb728b5a05e6aa70858a1a806699c5cf36c1252e84dd82d3b6399e9e54da3"} Dec 08 20:24:27 crc kubenswrapper[4781]: I1208 20:24:27.000071 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" event={"ID":"20fec9e5-0d25-4c44-b756-b163add48592","Type":"ContainerStarted","Data":"6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a"} Dec 08 20:24:27 crc kubenswrapper[4781]: I1208 20:24:27.000214 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:24:27 crc kubenswrapper[4781]: I1208 20:24:27.002607 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3aeb9976-ac47-4b7f-abc6-ce64e78619c5","Type":"ContainerStarted","Data":"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955"} Dec 08 20:24:27 crc kubenswrapper[4781]: I1208 20:24:27.002950 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerName="cinder-api-log" containerID="cri-o://ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749" gracePeriod=30 Dec 08 20:24:27 crc kubenswrapper[4781]: I1208 20:24:27.002971 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerName="cinder-api" containerID="cri-o://8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955" gracePeriod=30 Dec 08 20:24:27 crc kubenswrapper[4781]: I1208 20:24:27.029378 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" podStartSLOduration=4.029357072 podStartE2EDuration="4.029357072s" podCreationTimestamp="2025-12-08 20:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:27.020043384 +0000 UTC m=+1183.171326781" watchObservedRunningTime="2025-12-08 20:24:27.029357072 +0000 UTC m=+1183.180640449" Dec 08 20:24:27 
crc kubenswrapper[4781]: I1208 20:24:27.039708 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.039693219 podStartE2EDuration="4.039693219s" podCreationTimestamp="2025-12-08 20:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:27.038458133 +0000 UTC m=+1183.189741510" watchObservedRunningTime="2025-12-08 20:24:27.039693219 +0000 UTC m=+1183.190976596" Dec 08 20:24:27 crc kubenswrapper[4781]: I1208 20:24:27.930218 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.052383 4781 generic.go:334] "Generic (PLEG): container finished" podID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerID="8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955" exitCode=0 Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.052418 4781 generic.go:334] "Generic (PLEG): container finished" podID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerID="ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749" exitCode=143 Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.052497 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.052508 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3aeb9976-ac47-4b7f-abc6-ce64e78619c5","Type":"ContainerDied","Data":"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955"} Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.052536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3aeb9976-ac47-4b7f-abc6-ce64e78619c5","Type":"ContainerDied","Data":"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749"} Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.052547 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3aeb9976-ac47-4b7f-abc6-ce64e78619c5","Type":"ContainerDied","Data":"8ce20ff089415cef84496c9aa2894fc9371264c4c743fe6b75d89b3e79279cf1"} Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.052571 4781 scope.go:117] "RemoveContainer" containerID="8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.087008 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" event={"ID":"dcaa98ab-e000-41ef-bf28-189680138c66","Type":"ContainerStarted","Data":"aea871cf91c236aaaab1bd97fcc125964cc5b71b9fb6df75b3aaa1564aee82c9"} Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.087830 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-scripts\") pod \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.088000 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data-custom\") pod \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.088064 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-logs\") pod \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.088113 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-etc-machine-id\") pod \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.088178 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-combined-ca-bundle\") pod \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.088210 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data\") pod \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.088248 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6q4w\" (UniqueName: \"kubernetes.io/projected/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-kube-api-access-v6q4w\") pod \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\" (UID: \"3aeb9976-ac47-4b7f-abc6-ce64e78619c5\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.089711 
4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-676d96fbc7-6xfzf" event={"ID":"558023e1-d94c-4422-a958-796ba9bf387f","Type":"ContainerStarted","Data":"b591dc68be1b1d953e751d98f1a4bf5b4fb66f5bb76851632b254e0b56eff258"} Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.090556 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-logs" (OuterVolumeSpecName: "logs") pod "3aeb9976-ac47-4b7f-abc6-ce64e78619c5" (UID: "3aeb9976-ac47-4b7f-abc6-ce64e78619c5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.090993 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3aeb9976-ac47-4b7f-abc6-ce64e78619c5" (UID: "3aeb9976-ac47-4b7f-abc6-ce64e78619c5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.094936 4781 scope.go:117] "RemoveContainer" containerID="ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.098443 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-kube-api-access-v6q4w" (OuterVolumeSpecName: "kube-api-access-v6q4w") pod "3aeb9976-ac47-4b7f-abc6-ce64e78619c5" (UID: "3aeb9976-ac47-4b7f-abc6-ce64e78619c5"). InnerVolumeSpecName "kube-api-access-v6q4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.101499 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-scripts" (OuterVolumeSpecName: "scripts") pod "3aeb9976-ac47-4b7f-abc6-ce64e78619c5" (UID: "3aeb9976-ac47-4b7f-abc6-ce64e78619c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.112090 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3aeb9976-ac47-4b7f-abc6-ce64e78619c5" (UID: "3aeb9976-ac47-4b7f-abc6-ce64e78619c5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.146000 4781 scope.go:117] "RemoveContainer" containerID="8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955" Dec 08 20:24:28 crc kubenswrapper[4781]: E1208 20:24:28.146444 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955\": container with ID starting with 8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955 not found: ID does not exist" containerID="8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.146480 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955"} err="failed to get container status \"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955\": rpc error: code = NotFound desc = could not find container 
\"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955\": container with ID starting with 8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955 not found: ID does not exist" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.146501 4781 scope.go:117] "RemoveContainer" containerID="ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749" Dec 08 20:24:28 crc kubenswrapper[4781]: E1208 20:24:28.147005 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749\": container with ID starting with ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749 not found: ID does not exist" containerID="ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.147029 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749"} err="failed to get container status \"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749\": rpc error: code = NotFound desc = could not find container \"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749\": container with ID starting with ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749 not found: ID does not exist" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.147042 4781 scope.go:117] "RemoveContainer" containerID="8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.147476 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955"} err="failed to get container status \"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955\": rpc error: code = NotFound desc = could not find 
container \"8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955\": container with ID starting with 8dfb3fcfff19f11bf34fd056606b65824d739942313ce0aee4ff48bbd49cd955 not found: ID does not exist" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.147500 4781 scope.go:117] "RemoveContainer" containerID="ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.152180 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749"} err="failed to get container status \"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749\": rpc error: code = NotFound desc = could not find container \"ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749\": container with ID starting with ed6656fe4ed9d3faec64b7373dfdee70f46c21653972d9b48273f602c91e9749 not found: ID does not exist" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.190047 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6q4w\" (UniqueName: \"kubernetes.io/projected/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-kube-api-access-v6q4w\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.190083 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.190096 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.190303 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-logs\") 
on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.191427 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.201219 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3aeb9976-ac47-4b7f-abc6-ce64e78619c5" (UID: "3aeb9976-ac47-4b7f-abc6-ce64e78619c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.241831 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data" (OuterVolumeSpecName: "config-data") pod "3aeb9976-ac47-4b7f-abc6-ce64e78619c5" (UID: "3aeb9976-ac47-4b7f-abc6-ce64e78619c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.294363 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.294406 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3aeb9976-ac47-4b7f-abc6-ce64e78619c5-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.397015 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.419982 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.436436 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:28 crc kubenswrapper[4781]: E1208 20:24:28.436947 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerName="cinder-api" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.436975 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerName="cinder-api" Dec 08 20:24:28 crc kubenswrapper[4781]: E1208 20:24:28.437003 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerName="cinder-api-log" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.437012 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerName="cinder-api-log" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.437308 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" 
containerName="cinder-api" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.437340 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" containerName="cinder-api-log" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.439383 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.442155 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.445163 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.448973 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.449128 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499162 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499540 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499601 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499641 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-scripts\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-config-data\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499715 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499751 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-logs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499794 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jk6w\" (UniqueName: \"kubernetes.io/projected/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-kube-api-access-7jk6w\") pod \"cinder-api-0\" (UID: 
\"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.499826 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-config-data-custom\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.565508 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.601358 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-scripts\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.601408 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-config-data\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.601445 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.601498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-logs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " 
pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.601551 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jk6w\" (UniqueName: \"kubernetes.io/projected/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-kube-api-access-7jk6w\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.601622 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-config-data-custom\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.601689 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.602149 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-logs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.602314 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.602377 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.602462 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.606383 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.606492 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-config-data-custom\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.613630 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-scripts\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.613792 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.617721 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.623015 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-config-data\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.625459 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jk6w\" (UniqueName: \"kubernetes.io/projected/5c1478d5-3e67-4d45-b4c3-d5612e46db8d-kube-api-access-7jk6w\") pod \"cinder-api-0\" (UID: \"5c1478d5-3e67-4d45-b4c3-d5612e46db8d\") " pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.703859 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-scripts\") pod \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.703947 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-sg-core-conf-yaml\") pod \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.703984 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-combined-ca-bundle\") pod \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\" (UID: 
\"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.704028 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb9fz\" (UniqueName: \"kubernetes.io/projected/da20c49c-5d8d-4701-a54b-c6ae7b6670db-kube-api-access-gb9fz\") pod \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.704128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-run-httpd\") pod \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.704208 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-log-httpd\") pod \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.704267 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-config-data\") pod \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\" (UID: \"da20c49c-5d8d-4701-a54b-c6ae7b6670db\") " Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.705454 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "da20c49c-5d8d-4701-a54b-c6ae7b6670db" (UID: "da20c49c-5d8d-4701-a54b-c6ae7b6670db"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.705987 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "da20c49c-5d8d-4701-a54b-c6ae7b6670db" (UID: "da20c49c-5d8d-4701-a54b-c6ae7b6670db"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.709544 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da20c49c-5d8d-4701-a54b-c6ae7b6670db-kube-api-access-gb9fz" (OuterVolumeSpecName: "kube-api-access-gb9fz") pod "da20c49c-5d8d-4701-a54b-c6ae7b6670db" (UID: "da20c49c-5d8d-4701-a54b-c6ae7b6670db"). InnerVolumeSpecName "kube-api-access-gb9fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.712592 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-scripts" (OuterVolumeSpecName: "scripts") pod "da20c49c-5d8d-4701-a54b-c6ae7b6670db" (UID: "da20c49c-5d8d-4701-a54b-c6ae7b6670db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.736263 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "da20c49c-5d8d-4701-a54b-c6ae7b6670db" (UID: "da20c49c-5d8d-4701-a54b-c6ae7b6670db"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.762656 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da20c49c-5d8d-4701-a54b-c6ae7b6670db" (UID: "da20c49c-5d8d-4701-a54b-c6ae7b6670db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.806626 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.806661 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.806673 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.806684 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.806695 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb9fz\" (UniqueName: \"kubernetes.io/projected/da20c49c-5d8d-4701-a54b-c6ae7b6670db-kube-api-access-gb9fz\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.806708 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/da20c49c-5d8d-4701-a54b-c6ae7b6670db-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.811572 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-config-data" (OuterVolumeSpecName: "config-data") pod "da20c49c-5d8d-4701-a54b-c6ae7b6670db" (UID: "da20c49c-5d8d-4701-a54b-c6ae7b6670db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.862194 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 08 20:24:28 crc kubenswrapper[4781]: I1208 20:24:28.908518 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da20c49c-5d8d-4701-a54b-c6ae7b6670db-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.102786 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" event={"ID":"dcaa98ab-e000-41ef-bf28-189680138c66","Type":"ContainerStarted","Data":"2b43ded418c1ffcc46609fdf289968719b2c042e3e2fc2337c29dbd5ef8481cd"} Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.106878 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec3cd201-8baf-495f-aef9-91043bdfb8cb","Type":"ContainerStarted","Data":"4f849e3508d2f57774312883cc52d561f8901c9a4d4a6e24e5628850d08abb43"} Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.109784 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-676d96fbc7-6xfzf" event={"ID":"558023e1-d94c-4422-a958-796ba9bf387f","Type":"ContainerStarted","Data":"281cad4b504c4057b137eb52e38c5587cb19a65bc05138b4cafb5aa9f1669d78"} Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.112882 4781 
generic.go:334] "Generic (PLEG): container finished" podID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerID="974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d" exitCode=0 Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.112942 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da20c49c-5d8d-4701-a54b-c6ae7b6670db","Type":"ContainerDied","Data":"974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d"} Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.112972 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"da20c49c-5d8d-4701-a54b-c6ae7b6670db","Type":"ContainerDied","Data":"e67454e5f5b46af58d24c6f9f08ad90daba19b463d3f3259d2aefbcddd4d2ad7"} Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.112990 4781 scope.go:117] "RemoveContainer" containerID="14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.113030 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.155235 4781 scope.go:117] "RemoveContainer" containerID="1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.163283 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-676d96fbc7-6xfzf" podStartSLOduration=2.751595247 podStartE2EDuration="6.16307734s" podCreationTimestamp="2025-12-08 20:24:23 +0000 UTC" firstStartedPulling="2025-12-08 20:24:24.057895522 +0000 UTC m=+1180.209178889" lastFinishedPulling="2025-12-08 20:24:27.469377605 +0000 UTC m=+1183.620660982" observedRunningTime="2025-12-08 20:24:29.14533909 +0000 UTC m=+1185.296622487" watchObservedRunningTime="2025-12-08 20:24:29.16307734 +0000 UTC m=+1185.314360717" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.168504 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6d674cb658-5bj7r" podStartSLOduration=3.075216976 podStartE2EDuration="6.168484945s" podCreationTimestamp="2025-12-08 20:24:23 +0000 UTC" firstStartedPulling="2025-12-08 20:24:24.392087865 +0000 UTC m=+1180.543371242" lastFinishedPulling="2025-12-08 20:24:27.485355824 +0000 UTC m=+1183.636639211" observedRunningTime="2025-12-08 20:24:29.126727675 +0000 UTC m=+1185.278011052" watchObservedRunningTime="2025-12-08 20:24:29.168484945 +0000 UTC m=+1185.319768322" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.185728 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-86dcc84d-7tchk" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.191142 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.979699327 podStartE2EDuration="6.191120455s" podCreationTimestamp="2025-12-08 20:24:23 +0000 UTC" firstStartedPulling="2025-12-08 
20:24:24.678908806 +0000 UTC m=+1180.830192183" lastFinishedPulling="2025-12-08 20:24:25.890329934 +0000 UTC m=+1182.041613311" observedRunningTime="2025-12-08 20:24:29.170884434 +0000 UTC m=+1185.322167831" watchObservedRunningTime="2025-12-08 20:24:29.191120455 +0000 UTC m=+1185.342403832" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.244155 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.267782 4781 scope.go:117] "RemoveContainer" containerID="974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.269728 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.280696 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:29 crc kubenswrapper[4781]: E1208 20:24:29.281103 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="sg-core" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.281119 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="sg-core" Dec 08 20:24:29 crc kubenswrapper[4781]: E1208 20:24:29.281138 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="ceilometer-notification-agent" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.281145 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="ceilometer-notification-agent" Dec 08 20:24:29 crc kubenswrapper[4781]: E1208 20:24:29.281171 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="proxy-httpd" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.281178 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="proxy-httpd" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.281363 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="proxy-httpd" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.281377 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="sg-core" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.281390 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" containerName="ceilometer-notification-agent" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.283017 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.295873 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.296001 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.315301 4781 scope.go:117] "RemoveContainer" containerID="14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1" Dec 08 20:24:29 crc kubenswrapper[4781]: E1208 20:24:29.318260 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1\": container with ID starting with 14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1 not found: ID does not exist" containerID="14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.318318 4781 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1"} err="failed to get container status \"14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1\": rpc error: code = NotFound desc = could not find container \"14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1\": container with ID starting with 14e474036c862f125743f77505bf07ce0937ce075b267ea59eaa45a591286fe1 not found: ID does not exist" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.318339 4781 scope.go:117] "RemoveContainer" containerID="1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d" Dec 08 20:24:29 crc kubenswrapper[4781]: E1208 20:24:29.319645 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d\": container with ID starting with 1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d not found: ID does not exist" containerID="1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.319668 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d"} err="failed to get container status \"1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d\": rpc error: code = NotFound desc = could not find container \"1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d\": container with ID starting with 1a652fde42636178fb56e7d1e12ac960771f9ccc3be5d046923e413b14c4494d not found: ID does not exist" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.319683 4781 scope.go:117] "RemoveContainer" containerID="974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d" Dec 08 20:24:29 crc kubenswrapper[4781]: E1208 20:24:29.320056 4781 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d\": container with ID starting with 974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d not found: ID does not exist" containerID="974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.320074 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d"} err="failed to get container status \"974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d\": rpc error: code = NotFound desc = could not find container \"974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d\": container with ID starting with 974e260e1570724d32fa3dcd72d912b973dc9aa1dd8af3ec6d69a485f2f8ef7d not found: ID does not exist" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.347392 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.377684 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.428361 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-log-httpd\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.428427 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 
20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.428479 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-scripts\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.428525 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.428562 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-config-data\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.428646 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc4zf\" (UniqueName: \"kubernetes.io/projected/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-kube-api-access-gc4zf\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.428763 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-run-httpd\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.532001 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gc4zf\" (UniqueName: \"kubernetes.io/projected/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-kube-api-access-gc4zf\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.532386 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-run-httpd\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.532439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-log-httpd\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.532474 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.532517 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-scripts\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.532562 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 
crc kubenswrapper[4781]: I1208 20:24:29.532596 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-config-data\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.533311 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-run-httpd\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.533669 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-log-httpd\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.543363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.547074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.547646 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-scripts\") pod \"ceilometer-0\" (UID: 
\"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.548069 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-config-data\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.552567 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc4zf\" (UniqueName: \"kubernetes.io/projected/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-kube-api-access-gc4zf\") pod \"ceilometer-0\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.601361 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.875657 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56858f8966-hkmxw"] Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.878241 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.898753 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56858f8966-hkmxw"] Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.928562 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.928767 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.943028 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-internal-tls-certs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.943090 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljxkb\" (UniqueName: \"kubernetes.io/projected/ed36d878-8827-467c-95a0-1450798ad50e-kube-api-access-ljxkb\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.943121 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed36d878-8827-467c-95a0-1450798ad50e-logs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.943218 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-combined-ca-bundle\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.943249 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-config-data-custom\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.943282 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-public-tls-certs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:29 crc kubenswrapper[4781]: I1208 20:24:29.943303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-config-data\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.018235 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-594cfd6f5c-pppkp" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.048200 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-internal-tls-certs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.048266 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljxkb\" (UniqueName: \"kubernetes.io/projected/ed36d878-8827-467c-95a0-1450798ad50e-kube-api-access-ljxkb\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.048294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed36d878-8827-467c-95a0-1450798ad50e-logs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.048424 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-combined-ca-bundle\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.048455 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-config-data-custom\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " 
pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.048489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-public-tls-certs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.048505 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-config-data\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.049836 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed36d878-8827-467c-95a0-1450798ad50e-logs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.057451 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-public-tls-certs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.057467 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-combined-ca-bundle\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: 
I1208 20:24:30.058240 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-config-data-custom\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.059307 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-internal-tls-certs\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.059840 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed36d878-8827-467c-95a0-1450798ad50e-config-data\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.078461 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljxkb\" (UniqueName: \"kubernetes.io/projected/ed36d878-8827-467c-95a0-1450798ad50e-kube-api-access-ljxkb\") pod \"barbican-api-56858f8966-hkmxw\" (UID: \"ed36d878-8827-467c-95a0-1450798ad50e\") " pod="openstack/barbican-api-56858f8966-hkmxw" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.150400 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aeb9976-ac47-4b7f-abc6-ce64e78619c5" path="/var/lib/kubelet/pods/3aeb9976-ac47-4b7f-abc6-ce64e78619c5/volumes" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.151339 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/965ff155-f5c6-4c49-9d46-18a62ef94308-logs\") pod 
\"965ff155-f5c6-4c49-9d46-18a62ef94308\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.152149 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-scripts\") pod \"965ff155-f5c6-4c49-9d46-18a62ef94308\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.152667 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-config-data\") pod \"965ff155-f5c6-4c49-9d46-18a62ef94308\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.152851 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/965ff155-f5c6-4c49-9d46-18a62ef94308-horizon-secret-key\") pod \"965ff155-f5c6-4c49-9d46-18a62ef94308\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.152246 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/965ff155-f5c6-4c49-9d46-18a62ef94308-logs" (OuterVolumeSpecName: "logs") pod "965ff155-f5c6-4c49-9d46-18a62ef94308" (UID: "965ff155-f5c6-4c49-9d46-18a62ef94308"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.152998 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flw9b\" (UniqueName: \"kubernetes.io/projected/965ff155-f5c6-4c49-9d46-18a62ef94308-kube-api-access-flw9b\") pod \"965ff155-f5c6-4c49-9d46-18a62ef94308\" (UID: \"965ff155-f5c6-4c49-9d46-18a62ef94308\") " Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.153979 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/965ff155-f5c6-4c49-9d46-18a62ef94308-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.154779 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da20c49c-5d8d-4701-a54b-c6ae7b6670db" path="/var/lib/kubelet/pods/da20c49c-5d8d-4701-a54b-c6ae7b6670db/volumes" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.158527 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/965ff155-f5c6-4c49-9d46-18a62ef94308-kube-api-access-flw9b" (OuterVolumeSpecName: "kube-api-access-flw9b") pod "965ff155-f5c6-4c49-9d46-18a62ef94308" (UID: "965ff155-f5c6-4c49-9d46-18a62ef94308"). InnerVolumeSpecName "kube-api-access-flw9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.161829 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/965ff155-f5c6-4c49-9d46-18a62ef94308-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "965ff155-f5c6-4c49-9d46-18a62ef94308" (UID: "965ff155-f5c6-4c49-9d46-18a62ef94308"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.162412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5c1478d5-3e67-4d45-b4c3-d5612e46db8d","Type":"ContainerStarted","Data":"5055556de737184ebb3491394c89f44ba4c0542b9184149efa162f4cc412b555"} Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.180544 4781 generic.go:334] "Generic (PLEG): container finished" podID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerID="c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc" exitCode=137 Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.180581 4781 generic.go:334] "Generic (PLEG): container finished" podID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerID="63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd" exitCode=137 Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.180660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594cfd6f5c-pppkp" event={"ID":"965ff155-f5c6-4c49-9d46-18a62ef94308","Type":"ContainerDied","Data":"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc"} Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.180686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594cfd6f5c-pppkp" event={"ID":"965ff155-f5c6-4c49-9d46-18a62ef94308","Type":"ContainerDied","Data":"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd"} Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.180696 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594cfd6f5c-pppkp" event={"ID":"965ff155-f5c6-4c49-9d46-18a62ef94308","Type":"ContainerDied","Data":"9222305ff5447d60724e8dc626561bdb321cbe8de9906873a5a8590e54b90779"} Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.180711 4781 scope.go:117] "RemoveContainer" containerID="c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc" Dec 08 20:24:30 crc 
kubenswrapper[4781]: I1208 20:24:30.181547 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594cfd6f5c-pppkp"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.195579 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-config-data" (OuterVolumeSpecName: "config-data") pod "965ff155-f5c6-4c49-9d46-18a62ef94308" (UID: "965ff155-f5c6-4c49-9d46-18a62ef94308"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.205235 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-scripts" (OuterVolumeSpecName: "scripts") pod "965ff155-f5c6-4c49-9d46-18a62ef94308" (UID: "965ff155-f5c6-4c49-9d46-18a62ef94308"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.217399 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56858f8966-hkmxw"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.255770 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/965ff155-f5c6-4c49-9d46-18a62ef94308-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.256767 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flw9b\" (UniqueName: \"kubernetes.io/projected/965ff155-f5c6-4c49-9d46-18a62ef94308-kube-api-access-flw9b\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.256881 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.256991 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/965ff155-f5c6-4c49-9d46-18a62ef94308-config-data\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.389714 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.442757 4781 scope.go:117] "RemoveContainer" containerID="63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.622719 4781 scope.go:117] "RemoveContainer" containerID="c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc"
Dec 08 20:24:30 crc kubenswrapper[4781]: E1208 20:24:30.626190 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc\": container with ID starting with c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc not found: ID does not exist" containerID="c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.626363 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc"} err="failed to get container status \"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc\": rpc error: code = NotFound desc = could not find container \"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc\": container with ID starting with c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc not found: ID does not exist"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.626390 4781 scope.go:117] "RemoveContainer" containerID="63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd"
Dec 08 20:24:30 crc kubenswrapper[4781]: E1208 20:24:30.626965 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd\": container with ID starting with 63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd not found: ID does not exist" containerID="63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.626991 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd"} err="failed to get container status \"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd\": rpc error: code = NotFound desc = could not find container \"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd\": container with ID starting with 63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd not found: ID does not exist"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.627583 4781 scope.go:117] "RemoveContainer" containerID="c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.628272 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc"} err="failed to get container status \"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc\": rpc error: code = NotFound desc = could not find container \"c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc\": container with ID starting with c43bafcd98f8569e31384af758b81d6b069c97515af040aca06ef47229f057bc not found: ID does not exist"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.628291 4781 scope.go:117] "RemoveContainer" containerID="63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.628859 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd"} err="failed to get container status \"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd\": rpc error: code = NotFound desc = could not find container \"63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd\": container with ID starting with 63bc7d124bfa80d610235b9de0a3851ab6dfdeca63e4d2bed8849ea7063989cd not found: ID does not exist"
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.628930 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594cfd6f5c-pppkp"]
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.645581 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-594cfd6f5c-pppkp"]
Dec 08 20:24:30 crc kubenswrapper[4781]: I1208 20:24:30.818812 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56858f8966-hkmxw"]
Dec 08 20:24:30 crc kubenswrapper[4781]: W1208 20:24:30.825540 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded36d878_8827_467c_95a0_1450798ad50e.slice/crio-d8a35e7a442f5ee26491b7143da2ef21c1ba5d56a1716633b49ab4a242d92124 WatchSource:0}: Error finding container d8a35e7a442f5ee26491b7143da2ef21c1ba5d56a1716633b49ab4a242d92124: Status 404 returned error can't find the container with id d8a35e7a442f5ee26491b7143da2ef21c1ba5d56a1716633b49ab4a242d92124
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.206105 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56858f8966-hkmxw" event={"ID":"ed36d878-8827-467c-95a0-1450798ad50e","Type":"ContainerStarted","Data":"8de9c427d0262b8d858a799ff9fe84f4af43abd904cca1be71617e438953f8c3"}
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.206411 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56858f8966-hkmxw" event={"ID":"ed36d878-8827-467c-95a0-1450798ad50e","Type":"ContainerStarted","Data":"d8a35e7a442f5ee26491b7143da2ef21c1ba5d56a1716633b49ab4a242d92124"}
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.208439 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerStarted","Data":"4532f922a9104106b8fcfaab2f400ba80870604c911a442fb45360ed916ada7e"}
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.208461 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerStarted","Data":"95aedae02f2eed831c0d0652f932670ae75eab4c8a22691951caeb65990441d9"}
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.210446 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5c1478d5-3e67-4d45-b4c3-d5612e46db8d","Type":"ContainerStarted","Data":"6a50817fd2a6c95d82f8d88f3b037e0ed97b8bfbb1a8b9827ff5f96a57f2efd7"}
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.607358 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b4c587749-kmfjj"
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.707119 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86dcc84d-7tchk"]
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.707375 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86dcc84d-7tchk" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerName="neutron-api" containerID="cri-o://7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8" gracePeriod=30
Dec 08 20:24:31 crc kubenswrapper[4781]: I1208 20:24:31.707888 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86dcc84d-7tchk" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerName="neutron-httpd" containerID="cri-o://86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17" gracePeriod=30
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.159303 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" path="/var/lib/kubelet/pods/965ff155-f5c6-4c49-9d46-18a62ef94308/volumes"
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.229685 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerStarted","Data":"23541fb6775a7a3e1833ef55a404b04d07f4c2a4e1c2f965fc435a61169a8950"}
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.232959 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"5c1478d5-3e67-4d45-b4c3-d5612e46db8d","Type":"ContainerStarted","Data":"4b3664a3b5c218719cb7bf0babc00a0ccab2fad9befbeddf6278fbce1fb0329d"}
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.238143 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.240799 4781 generic.go:334] "Generic (PLEG): container finished" podID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerID="86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17" exitCode=0
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.241345 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86dcc84d-7tchk" event={"ID":"0c055aae-fa31-4d84-aa1b-b60c8829a61b","Type":"ContainerDied","Data":"86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17"}
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.262337 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56858f8966-hkmxw" event={"ID":"ed36d878-8827-467c-95a0-1450798ad50e","Type":"ContainerStarted","Data":"b43d6c677afc55fdae88e4538bd0beaf20be87a52e2235eacf2f130e0a794abf"}
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.262609 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56858f8966-hkmxw"
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.263449 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56858f8966-hkmxw"
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.267621 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.267608973 podStartE2EDuration="4.267608973s" podCreationTimestamp="2025-12-08 20:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:32.262432194 +0000 UTC m=+1188.413715581" watchObservedRunningTime="2025-12-08 20:24:32.267608973 +0000 UTC m=+1188.418892350"
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.295591 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56858f8966-hkmxw" podStartSLOduration=3.295572707 podStartE2EDuration="3.295572707s" podCreationTimestamp="2025-12-08 20:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:32.288298508 +0000 UTC m=+1188.439581875" watchObservedRunningTime="2025-12-08 20:24:32.295572707 +0000 UTC m=+1188.446856084"
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.498270 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66bd65f4cd-rdlc7"
Dec 08 20:24:32 crc kubenswrapper[4781]: I1208 20:24:32.741409 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-75dd9f77c4-85lhw"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.093118 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86dcc84d-7tchk"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.239362 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-ovndb-tls-certs\") pod \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") "
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.239519 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q94vl\" (UniqueName: \"kubernetes.io/projected/0c055aae-fa31-4d84-aa1b-b60c8829a61b-kube-api-access-q94vl\") pod \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") "
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.239615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-config\") pod \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") "
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.239723 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-combined-ca-bundle\") pod \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") "
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.239831 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-httpd-config\") pod \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\" (UID: \"0c055aae-fa31-4d84-aa1b-b60c8829a61b\") "
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.244898 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "0c055aae-fa31-4d84-aa1b-b60c8829a61b" (UID: "0c055aae-fa31-4d84-aa1b-b60c8829a61b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.245184 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c055aae-fa31-4d84-aa1b-b60c8829a61b-kube-api-access-q94vl" (OuterVolumeSpecName: "kube-api-access-q94vl") pod "0c055aae-fa31-4d84-aa1b-b60c8829a61b" (UID: "0c055aae-fa31-4d84-aa1b-b60c8829a61b"). InnerVolumeSpecName "kube-api-access-q94vl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.290975 4781 generic.go:334] "Generic (PLEG): container finished" podID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerID="7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8" exitCode=0
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.291045 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86dcc84d-7tchk" event={"ID":"0c055aae-fa31-4d84-aa1b-b60c8829a61b","Type":"ContainerDied","Data":"7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8"}
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.291083 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86dcc84d-7tchk" event={"ID":"0c055aae-fa31-4d84-aa1b-b60c8829a61b","Type":"ContainerDied","Data":"c7102c5f9b95799652a4bb652fa11cdc5f0afcaeb9f4630de40ceccd22db4715"}
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.291101 4781 scope.go:117] "RemoveContainer" containerID="86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.291229 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86dcc84d-7tchk"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.299405 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerStarted","Data":"54127ec91fe42d9fab05a73ded10541e3b21826332319330b24cca9b11c51a34"}
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.304640 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c055aae-fa31-4d84-aa1b-b60c8829a61b" (UID: "0c055aae-fa31-4d84-aa1b-b60c8829a61b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.315765 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-config" (OuterVolumeSpecName: "config") pod "0c055aae-fa31-4d84-aa1b-b60c8829a61b" (UID: "0c055aae-fa31-4d84-aa1b-b60c8829a61b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.320662 4781 scope.go:117] "RemoveContainer" containerID="7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.340085 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "0c055aae-fa31-4d84-aa1b-b60c8829a61b" (UID: "0c055aae-fa31-4d84-aa1b-b60c8829a61b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.342589 4781 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.342628 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q94vl\" (UniqueName: \"kubernetes.io/projected/0c055aae-fa31-4d84-aa1b-b60c8829a61b-kube-api-access-q94vl\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.342643 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.342655 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.342667 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0c055aae-fa31-4d84-aa1b-b60c8829a61b-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.344296 4781 scope.go:117] "RemoveContainer" containerID="86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17"
Dec 08 20:24:33 crc kubenswrapper[4781]: E1208 20:24:33.347069 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17\": container with ID starting with 86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17 not found: ID does not exist" containerID="86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.347393 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17"} err="failed to get container status \"86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17\": rpc error: code = NotFound desc = could not find container \"86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17\": container with ID starting with 86263da5d3bd8712aa4874eedc1af5f784bac940470b3ce2c7cca3b984caed17 not found: ID does not exist"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.347443 4781 scope.go:117] "RemoveContainer" containerID="7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8"
Dec 08 20:24:33 crc kubenswrapper[4781]: E1208 20:24:33.348361 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8\": container with ID starting with 7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8 not found: ID does not exist" containerID="7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.348404 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8"} err="failed to get container status \"7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8\": rpc error: code = NotFound desc = could not find container \"7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8\": container with ID starting with 7e363bf8d440b983fb1557f7e492872cdbde8d90bac4e9ac82381f3137f60aa8 not found: ID does not exist"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.625198 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86dcc84d-7tchk"]
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.632875 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-86dcc84d-7tchk"]
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.870559 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 08 20:24:33 crc kubenswrapper[4781]: I1208 20:24:33.952084 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj"
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.047558 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-pmqdz"]
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.048271 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" podUID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" containerName="dnsmasq-dns" containerID="cri-o://6f09bcd45ea314fbb421767e43b183648dd8bf31e41ffae61eb661b526d1b414" gracePeriod=10
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.238518 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" path="/var/lib/kubelet/pods/0c055aae-fa31-4d84-aa1b-b60c8829a61b/volumes"
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.244823 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.332374 4781 generic.go:334] "Generic (PLEG): container finished" podID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" containerID="6f09bcd45ea314fbb421767e43b183648dd8bf31e41ffae61eb661b526d1b414" exitCode=0
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.332469 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" event={"ID":"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a","Type":"ContainerDied","Data":"6f09bcd45ea314fbb421767e43b183648dd8bf31e41ffae61eb661b526d1b414"}
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.395161 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.740374 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz"
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.894610 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-swift-storage-0\") pod \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") "
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.894678 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-nb\") pod \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") "
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.894762 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-config\") pod \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") "
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.894783 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-sb\") pod \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") "
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.894854 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49vjw\" (UniqueName: \"kubernetes.io/projected/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-kube-api-access-49vjw\") pod \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") "
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.894950 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-svc\") pod \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\" (UID: \"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a\") "
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.902443 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-kube-api-access-49vjw" (OuterVolumeSpecName: "kube-api-access-49vjw") pod "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" (UID: "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a"). InnerVolumeSpecName "kube-api-access-49vjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.972696 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" (UID: "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.986542 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" (UID: "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.995381 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-config" (OuterVolumeSpecName: "config") pod "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" (UID: "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.996566 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-config\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.996582 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49vjw\" (UniqueName: \"kubernetes.io/projected/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-kube-api-access-49vjw\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.996593 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:34 crc kubenswrapper[4781]: I1208 20:24:34.996602 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.013751 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" (UID: "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.019129 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" (UID: "f65511c0-39b0-41f6-8f01-5e1d12aa5c9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.020365 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-66bd65f4cd-rdlc7"
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.098907 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.099341 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.264783 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-75dd9f77c4-85lhw"
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.328880 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-66bd65f4cd-rdlc7"]
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.348612 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz" event={"ID":"f65511c0-39b0-41f6-8f01-5e1d12aa5c9a","Type":"ContainerDied","Data":"ae0e8ed4e5cae824a29a9c50f7837d8a609730018a9435e89ed637ab8f774348"}
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.348661 4781 scope.go:117] "RemoveContainer" containerID="6f09bcd45ea314fbb421767e43b183648dd8bf31e41ffae61eb661b526d1b414"
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.348785 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-pmqdz"
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.361315 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66bd65f4cd-rdlc7" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon-log" containerID="cri-o://458aad91b280fae6c403fbd40963945d0173a54e50791aa99b5456b5962d7f8c" gracePeriod=30
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.361764 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-66bd65f4cd-rdlc7" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" containerID="cri-o://02389c5659f9a03f3983d67ab739b10de654ad7c39804e031b67fa0e1b85821a" gracePeriod=30
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.362718 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerStarted","Data":"c0d2c47f2bbffb5189b30ef17720899d7dadd7241f4877b72b0074537c85963c"}
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.362757 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.362961 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerName="cinder-scheduler" containerID="cri-o://ba7bb728b5a05e6aa70858a1a806699c5cf36c1252e84dd82d3b6399e9e54da3" gracePeriod=30
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.363072 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerName="probe" containerID="cri-o://4f849e3508d2f57774312883cc52d561f8901c9a4d4a6e24e5628850d08abb43" gracePeriod=30
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.388446 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-pmqdz"]
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.389316 4781 scope.go:117] "RemoveContainer" containerID="0c7e062cfccabee5d125366983c35cdea59eeff9f35f8f3ee8de058481f5b85a"
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.395530 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-pmqdz"]
Dec 08 20:24:35 crc kubenswrapper[4781]: I1208 20:24:35.418343 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.797207887 podStartE2EDuration="6.418327133s" podCreationTimestamp="2025-12-08 20:24:29 +0000 UTC" firstStartedPulling="2025-12-08 20:24:30.472160394 +0000 UTC m=+1186.623443771" lastFinishedPulling="2025-12-08 20:24:34.09327964 +0000 UTC m=+1190.244563017" observedRunningTime="2025-12-08 20:24:35.412527426 +0000 UTC m=+1191.563810803" watchObservedRunningTime="2025-12-08 20:24:35.418327133 +0000 UTC m=+1191.569610510"
Dec 08 20:24:36 crc kubenswrapper[4781]: I1208 20:24:36.069567 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55d57ffb96-9nv59"
Dec 08 20:24:36 crc kubenswrapper[4781]: I1208 20:24:36.079481 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-55d57ffb96-9nv59"
Dec 08 20:24:36 crc kubenswrapper[4781]: I1208 20:24:36.139241 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" path="/var/lib/kubelet/pods/f65511c0-39b0-41f6-8f01-5e1d12aa5c9a/volumes"
Dec 08 20:24:36 crc kubenswrapper[4781]: I1208 20:24:36.371959 4781 generic.go:334] "Generic (PLEG): container finished" podID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerID="4f849e3508d2f57774312883cc52d561f8901c9a4d4a6e24e5628850d08abb43" exitCode=0
Dec 08 20:24:36 crc kubenswrapper[4781]: I1208 20:24:36.372032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec3cd201-8baf-495f-aef9-91043bdfb8cb","Type":"ContainerDied","Data":"4f849e3508d2f57774312883cc52d561f8901c9a4d4a6e24e5628850d08abb43"}
Dec 08 20:24:37 crc kubenswrapper[4781]: I1208 20:24:37.177401 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cb6b5bcfd-8jc9n"
Dec 08 20:24:37 crc kubenswrapper[4781]: I1208 20:24:37.245786 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7cb6b5bcfd-8jc9n"
Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.395291 4781 generic.go:334] "Generic (PLEG): container finished" podID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerID="ba7bb728b5a05e6aa70858a1a806699c5cf36c1252e84dd82d3b6399e9e54da3" exitCode=0
Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.395790 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec3cd201-8baf-495f-aef9-91043bdfb8cb","Type":"ContainerDied","Data":"ba7bb728b5a05e6aa70858a1a806699c5cf36c1252e84dd82d3b6399e9e54da3"}
Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.648298 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.801370 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-scripts\") pod \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.801793 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data-custom\") pod \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.801826 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec3cd201-8baf-495f-aef9-91043bdfb8cb-etc-machine-id\") pod \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.801905 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data\") pod \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.801970 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec3cd201-8baf-495f-aef9-91043bdfb8cb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ec3cd201-8baf-495f-aef9-91043bdfb8cb" (UID: "ec3cd201-8baf-495f-aef9-91043bdfb8cb"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.802093 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-combined-ca-bundle\") pod \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.802158 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knh2j\" (UniqueName: \"kubernetes.io/projected/ec3cd201-8baf-495f-aef9-91043bdfb8cb-kube-api-access-knh2j\") pod \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\" (UID: \"ec3cd201-8baf-495f-aef9-91043bdfb8cb\") " Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.802610 4781 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec3cd201-8baf-495f-aef9-91043bdfb8cb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.813196 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-scripts" (OuterVolumeSpecName: "scripts") pod "ec3cd201-8baf-495f-aef9-91043bdfb8cb" (UID: "ec3cd201-8baf-495f-aef9-91043bdfb8cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.813256 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ec3cd201-8baf-495f-aef9-91043bdfb8cb" (UID: "ec3cd201-8baf-495f-aef9-91043bdfb8cb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.815150 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec3cd201-8baf-495f-aef9-91043bdfb8cb-kube-api-access-knh2j" (OuterVolumeSpecName: "kube-api-access-knh2j") pod "ec3cd201-8baf-495f-aef9-91043bdfb8cb" (UID: "ec3cd201-8baf-495f-aef9-91043bdfb8cb"). InnerVolumeSpecName "kube-api-access-knh2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.862441 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec3cd201-8baf-495f-aef9-91043bdfb8cb" (UID: "ec3cd201-8baf-495f-aef9-91043bdfb8cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.904574 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.904613 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knh2j\" (UniqueName: \"kubernetes.io/projected/ec3cd201-8baf-495f-aef9-91043bdfb8cb-kube-api-access-knh2j\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.904627 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.904638 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:38 crc kubenswrapper[4781]: I1208 20:24:38.936221 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data" (OuterVolumeSpecName: "config-data") pod "ec3cd201-8baf-495f-aef9-91043bdfb8cb" (UID: "ec3cd201-8baf-495f-aef9-91043bdfb8cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.006027 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3cd201-8baf-495f-aef9-91043bdfb8cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.406007 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerID="02389c5659f9a03f3983d67ab739b10de654ad7c39804e031b67fa0e1b85821a" exitCode=0 Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.406084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bd65f4cd-rdlc7" event={"ID":"2b714ffd-7c31-434d-a833-04abe6c8dcfb","Type":"ContainerDied","Data":"02389c5659f9a03f3983d67ab739b10de654ad7c39804e031b67fa0e1b85821a"} Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.408013 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ec3cd201-8baf-495f-aef9-91043bdfb8cb","Type":"ContainerDied","Data":"93b85babd040f8fda4618fe986a7283b71eabe156e0d9bbc829f9042f005c2f1"} Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.408054 4781 scope.go:117] "RemoveContainer" containerID="4f849e3508d2f57774312883cc52d561f8901c9a4d4a6e24e5628850d08abb43" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.408065 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.464628 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.465426 4781 scope.go:117] "RemoveContainer" containerID="ba7bb728b5a05e6aa70858a1a806699c5cf36c1252e84dd82d3b6399e9e54da3" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.481027 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.506350 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 20:24:39 crc kubenswrapper[4781]: E1208 20:24:39.506850 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerName="neutron-httpd" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.506869 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerName="neutron-httpd" Dec 08 20:24:39 crc kubenswrapper[4781]: E1208 20:24:39.506890 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" containerName="init" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.506898 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" containerName="init" Dec 08 20:24:39 crc kubenswrapper[4781]: E1208 20:24:39.506927 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerName="cinder-scheduler" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.506955 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerName="cinder-scheduler" Dec 08 20:24:39 crc kubenswrapper[4781]: E1208 20:24:39.506977 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" containerName="dnsmasq-dns" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.506985 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" containerName="dnsmasq-dns" Dec 08 20:24:39 crc kubenswrapper[4781]: E1208 20:24:39.507001 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerName="neutron-api" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507010 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerName="neutron-api" Dec 08 20:24:39 crc kubenswrapper[4781]: E1208 20:24:39.507027 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerName="probe" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507034 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerName="probe" Dec 08 20:24:39 crc kubenswrapper[4781]: E1208 20:24:39.507052 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerName="horizon" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507061 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerName="horizon" Dec 08 20:24:39 crc kubenswrapper[4781]: E1208 20:24:39.507079 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerName="horizon-log" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507086 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerName="horizon-log" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507300 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerName="neutron-httpd" 
Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507316 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c055aae-fa31-4d84-aa1b-b60c8829a61b" containerName="neutron-api" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507332 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerName="probe" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507353 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerName="horizon-log" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507366 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="965ff155-f5c6-4c49-9d46-18a62ef94308" containerName="horizon" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507375 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" containerName="cinder-scheduler" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.507384 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f65511c0-39b0-41f6-8f01-5e1d12aa5c9a" containerName="dnsmasq-dns" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.508565 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.511465 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.522169 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.615546 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.615595 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kngw9\" (UniqueName: \"kubernetes.io/projected/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-kube-api-access-kngw9\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.615708 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.615811 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.615984 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.616031 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.717550 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.717614 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.717632 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.717666 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.717683 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kngw9\" (UniqueName: \"kubernetes.io/projected/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-kube-api-access-kngw9\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.717724 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.717736 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.722691 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.723235 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-config-data\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " 
pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.723699 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-scripts\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.732484 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.743531 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kngw9\" (UniqueName: \"kubernetes.io/projected/ce9b542d-77aa-4a8d-95fe-393a5a0dafa2-kube-api-access-kngw9\") pod \"cinder-scheduler-0\" (UID: \"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2\") " pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.844415 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.926355 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66bd65f4cd-rdlc7" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 08 20:24:39 crc kubenswrapper[4781]: I1208 20:24:39.986086 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-796ff6864f-m6jf6" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.373956 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec3cd201-8baf-495f-aef9-91043bdfb8cb" path="/var/lib/kubelet/pods/ec3cd201-8baf-495f-aef9-91043bdfb8cb/volumes" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.458823 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.460157 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.466339 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.466686 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.467043 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7g79w" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.476192 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.734884 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.741447 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee967571-7083-4ff7-a035-b90fadf420ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.742136 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee967571-7083-4ff7-a035-b90fadf420ee-openstack-config\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.742171 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee967571-7083-4ff7-a035-b90fadf420ee-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.742226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg65k\" (UniqueName: \"kubernetes.io/projected/ee967571-7083-4ff7-a035-b90fadf420ee-kube-api-access-dg65k\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.844489 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee967571-7083-4ff7-a035-b90fadf420ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.844593 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee967571-7083-4ff7-a035-b90fadf420ee-openstack-config\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.844631 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee967571-7083-4ff7-a035-b90fadf420ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient" Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.844715 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg65k\" (UniqueName: \"kubernetes.io/projected/ee967571-7083-4ff7-a035-b90fadf420ee-kube-api-access-dg65k\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient" Dec 08 20:24:40 crc 
kubenswrapper[4781]: I1208 20:24:40.845754 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ee967571-7083-4ff7-a035-b90fadf420ee-openstack-config\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient"
Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.852148 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee967571-7083-4ff7-a035-b90fadf420ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient"
Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.853611 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ee967571-7083-4ff7-a035-b90fadf420ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient"
Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.867560 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg65k\" (UniqueName: \"kubernetes.io/projected/ee967571-7083-4ff7-a035-b90fadf420ee-kube-api-access-dg65k\") pod \"openstackclient\" (UID: \"ee967571-7083-4ff7-a035-b90fadf420ee\") " pod="openstack/openstackclient"
Dec 08 20:24:40 crc kubenswrapper[4781]: I1208 20:24:40.986796 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 08 20:24:41 crc kubenswrapper[4781]: I1208 20:24:41.469470 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2","Type":"ContainerStarted","Data":"f117546abf4ecbbdd60ef7087df7b531d0fa7bd552b7e71d41cf7a1efce1e3a6"}
Dec 08 20:24:41 crc kubenswrapper[4781]: I1208 20:24:41.504259 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 08 20:24:41 crc kubenswrapper[4781]: W1208 20:24:41.507094 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee967571_7083_4ff7_a035_b90fadf420ee.slice/crio-2ab275fafa41ca1695b4ab68d63260493b966674314f3aadd874dfcf6fb0b735 WatchSource:0}: Error finding container 2ab275fafa41ca1695b4ab68d63260493b966674314f3aadd874dfcf6fb0b735: Status 404 returned error can't find the container with id 2ab275fafa41ca1695b4ab68d63260493b966674314f3aadd874dfcf6fb0b735
Dec 08 20:24:41 crc kubenswrapper[4781]: I1208 20:24:41.776775 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.490087 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2","Type":"ContainerStarted","Data":"0c7446c0d9b01e1b87a847001425055c692c2ebbafdbbfbfc74d51d4e3d49598"}
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.490418 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ce9b542d-77aa-4a8d-95fe-393a5a0dafa2","Type":"ContainerStarted","Data":"b5db863108b3d38a18795546bc213351ef9f91ad759a5bfc1cea7881063459d4"}
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.519415 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ee967571-7083-4ff7-a035-b90fadf420ee","Type":"ContainerStarted","Data":"2ab275fafa41ca1695b4ab68d63260493b966674314f3aadd874dfcf6fb0b735"}
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.553644 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.553614982 podStartE2EDuration="3.553614982s" podCreationTimestamp="2025-12-08 20:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:42.529021895 +0000 UTC m=+1198.680305292" watchObservedRunningTime="2025-12-08 20:24:42.553614982 +0000 UTC m=+1198.704898359"
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.580663 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56858f8966-hkmxw"
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.879753 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56858f8966-hkmxw"
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.945715 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55d57ffb96-9nv59"]
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.946004 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55d57ffb96-9nv59" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api-log" containerID="cri-o://2ce8da86a5c34c4bde70c84dab43dd46393f6d7dd4e26d5c08d6b8210f48d74b" gracePeriod=30
Dec 08 20:24:42 crc kubenswrapper[4781]: I1208 20:24:42.946209 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-55d57ffb96-9nv59" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api" containerID="cri-o://625ea809d498c0e401b8086b5ac3d0a4cdf7662262e4daa59b098426c120d4e1" gracePeriod=30
Dec 08 20:24:43 crc kubenswrapper[4781]: I1208 20:24:43.540116 4781 generic.go:334] "Generic (PLEG): container finished" podID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerID="2ce8da86a5c34c4bde70c84dab43dd46393f6d7dd4e26d5c08d6b8210f48d74b" exitCode=143
Dec 08 20:24:43 crc kubenswrapper[4781]: I1208 20:24:43.541801 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d57ffb96-9nv59" event={"ID":"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3","Type":"ContainerDied","Data":"2ce8da86a5c34c4bde70c84dab43dd46393f6d7dd4e26d5c08d6b8210f48d74b"}
Dec 08 20:24:44 crc kubenswrapper[4781]: I1208 20:24:44.845199 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.267639 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.269154 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="sg-core" containerID="cri-o://54127ec91fe42d9fab05a73ded10541e3b21826332319330b24cca9b11c51a34" gracePeriod=30
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.269183 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="proxy-httpd" containerID="cri-o://c0d2c47f2bbffb5189b30ef17720899d7dadd7241f4877b72b0074537c85963c" gracePeriod=30
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.269183 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="ceilometer-notification-agent" containerID="cri-o://23541fb6775a7a3e1833ef55a404b04d07f4c2a4e1c2f965fc435a61169a8950" gracePeriod=30
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.269582 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="ceilometer-central-agent" containerID="cri-o://4532f922a9104106b8fcfaab2f400ba80870604c911a442fb45360ed916ada7e" gracePeriod=30
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.278425 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.401404 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55d57ffb96-9nv59" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:49082->10.217.0.160:9311: read: connection reset by peer"
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.401805 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55d57ffb96-9nv59" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:49094->10.217.0.160:9311: read: connection reset by peer"
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.575121 4781 generic.go:334] "Generic (PLEG): container finished" podID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerID="625ea809d498c0e401b8086b5ac3d0a4cdf7662262e4daa59b098426c120d4e1" exitCode=0
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.575187 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d57ffb96-9nv59" event={"ID":"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3","Type":"ContainerDied","Data":"625ea809d498c0e401b8086b5ac3d0a4cdf7662262e4daa59b098426c120d4e1"}
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.578135 4781 generic.go:334] "Generic (PLEG): container finished" podID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerID="c0d2c47f2bbffb5189b30ef17720899d7dadd7241f4877b72b0074537c85963c" exitCode=0
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.578152 4781 generic.go:334] "Generic (PLEG): container finished" podID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerID="54127ec91fe42d9fab05a73ded10541e3b21826332319330b24cca9b11c51a34" exitCode=2
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.578165 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerDied","Data":"c0d2c47f2bbffb5189b30ef17720899d7dadd7241f4877b72b0074537c85963c"}
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.578179 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerDied","Data":"54127ec91fe42d9fab05a73ded10541e3b21826332319330b24cca9b11c51a34"}
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.918627 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7dc689b5b9-6r5rd"]
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.922335 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.928543 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.928757 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.929823 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Dec 08 20:24:46 crc kubenswrapper[4781]: I1208 20:24:46.943071 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7dc689b5b9-6r5rd"]
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.080063 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-log-httpd\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.080134 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-public-tls-certs\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.080318 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-internal-tls-certs\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.080385 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-config-data\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.080461 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-combined-ca-bundle\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.080597 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-etc-swift\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.080650 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-run-httpd\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.080720 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2nk6\" (UniqueName: \"kubernetes.io/projected/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-kube-api-access-g2nk6\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.182597 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2nk6\" (UniqueName: \"kubernetes.io/projected/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-kube-api-access-g2nk6\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.182693 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-log-httpd\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.182725 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-public-tls-certs\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.182784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-internal-tls-certs\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.182811 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-config-data\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.182865 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-combined-ca-bundle\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.182949 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-etc-swift\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.182976 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-run-httpd\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.183577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-run-httpd\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.184721 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-log-httpd\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.190009 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-public-tls-certs\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.190076 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-internal-tls-certs\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.190167 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-config-data\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.202323 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-etc-swift\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.203675 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2nk6\" (UniqueName: \"kubernetes.io/projected/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-kube-api-access-g2nk6\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.205429 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6925de4-2c7d-43cb-b2a9-1ec66f56b007-combined-ca-bundle\") pod \"swift-proxy-7dc689b5b9-6r5rd\" (UID: \"d6925de4-2c7d-43cb-b2a9-1ec66f56b007\") " pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.239721 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7dc689b5b9-6r5rd"
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.588276 4781 generic.go:334] "Generic (PLEG): container finished" podID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerID="4532f922a9104106b8fcfaab2f400ba80870604c911a442fb45360ed916ada7e" exitCode=0
Dec 08 20:24:47 crc kubenswrapper[4781]: I1208 20:24:47.588570 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerDied","Data":"4532f922a9104106b8fcfaab2f400ba80870604c911a442fb45360ed916ada7e"}
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.469904 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-dplq8"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.471316 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dplq8"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.495762 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dplq8"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.578012 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-dvfwv"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.579646 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dvfwv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.595398 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-dae9-account-create-update-jrqz4"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.596480 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dae9-account-create-update-jrqz4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.598857 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.615547 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dvfwv"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.625471 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dae9-account-create-update-jrqz4"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.632074 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-operator-scripts\") pod \"nova-api-db-create-dplq8\" (UID: \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\") " pod="openstack/nova-api-db-create-dplq8"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.632173 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqxnb\" (UniqueName: \"kubernetes.io/projected/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-kube-api-access-lqxnb\") pod \"nova-api-db-create-dplq8\" (UID: \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\") " pod="openstack/nova-api-db-create-dplq8"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.665122 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-xlpd4"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.666316 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xlpd4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.675109 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xlpd4"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.740759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-operator-scripts\") pod \"nova-api-db-create-dplq8\" (UID: \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\") " pod="openstack/nova-api-db-create-dplq8"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.740840 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9bdm\" (UniqueName: \"kubernetes.io/projected/0ada136d-baf8-411b-aa50-66edaf44a52b-kube-api-access-x9bdm\") pod \"nova-cell1-db-create-xlpd4\" (UID: \"0ada136d-baf8-411b-aa50-66edaf44a52b\") " pod="openstack/nova-cell1-db-create-xlpd4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.741069 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ada136d-baf8-411b-aa50-66edaf44a52b-operator-scripts\") pod \"nova-cell1-db-create-xlpd4\" (UID: \"0ada136d-baf8-411b-aa50-66edaf44a52b\") " pod="openstack/nova-cell1-db-create-xlpd4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.741131 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqxnb\" (UniqueName: \"kubernetes.io/projected/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-kube-api-access-lqxnb\") pod \"nova-api-db-create-dplq8\" (UID: \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\") " pod="openstack/nova-api-db-create-dplq8"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.741170 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-operator-scripts\") pod \"nova-cell0-db-create-dvfwv\" (UID: \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\") " pod="openstack/nova-cell0-db-create-dvfwv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.741262 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afd06929-88eb-4520-a1e9-97c32d3c2223-operator-scripts\") pod \"nova-api-dae9-account-create-update-jrqz4\" (UID: \"afd06929-88eb-4520-a1e9-97c32d3c2223\") " pod="openstack/nova-api-dae9-account-create-update-jrqz4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.741322 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zvqp\" (UniqueName: \"kubernetes.io/projected/afd06929-88eb-4520-a1e9-97c32d3c2223-kube-api-access-4zvqp\") pod \"nova-api-dae9-account-create-update-jrqz4\" (UID: \"afd06929-88eb-4520-a1e9-97c32d3c2223\") " pod="openstack/nova-api-dae9-account-create-update-jrqz4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.741355 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8wd6\" (UniqueName: \"kubernetes.io/projected/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-kube-api-access-q8wd6\") pod \"nova-cell0-db-create-dvfwv\" (UID: \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\") " pod="openstack/nova-cell0-db-create-dvfwv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.742343 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-operator-scripts\") pod \"nova-api-db-create-dplq8\" (UID: \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\") " pod="openstack/nova-api-db-create-dplq8"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.773874 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqxnb\" (UniqueName: \"kubernetes.io/projected/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-kube-api-access-lqxnb\") pod \"nova-api-db-create-dplq8\" (UID: \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\") " pod="openstack/nova-api-db-create-dplq8"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.784651 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-057f-account-create-update-58gjv"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.787907 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-057f-account-create-update-58gjv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.794229 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.805712 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dplq8"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.808853 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-057f-account-create-update-58gjv"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.843119 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9bdm\" (UniqueName: \"kubernetes.io/projected/0ada136d-baf8-411b-aa50-66edaf44a52b-kube-api-access-x9bdm\") pod \"nova-cell1-db-create-xlpd4\" (UID: \"0ada136d-baf8-411b-aa50-66edaf44a52b\") " pod="openstack/nova-cell1-db-create-xlpd4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.843211 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ada136d-baf8-411b-aa50-66edaf44a52b-operator-scripts\") pod \"nova-cell1-db-create-xlpd4\" (UID: \"0ada136d-baf8-411b-aa50-66edaf44a52b\") " pod="openstack/nova-cell1-db-create-xlpd4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.843258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-operator-scripts\") pod \"nova-cell0-db-create-dvfwv\" (UID: \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\") " pod="openstack/nova-cell0-db-create-dvfwv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.843284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afd06929-88eb-4520-a1e9-97c32d3c2223-operator-scripts\") pod \"nova-api-dae9-account-create-update-jrqz4\" (UID: \"afd06929-88eb-4520-a1e9-97c32d3c2223\") " pod="openstack/nova-api-dae9-account-create-update-jrqz4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.843307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zvqp\" (UniqueName: \"kubernetes.io/projected/afd06929-88eb-4520-a1e9-97c32d3c2223-kube-api-access-4zvqp\") pod \"nova-api-dae9-account-create-update-jrqz4\" (UID: \"afd06929-88eb-4520-a1e9-97c32d3c2223\") " pod="openstack/nova-api-dae9-account-create-update-jrqz4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.843325 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8wd6\" (UniqueName: \"kubernetes.io/projected/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-kube-api-access-q8wd6\") pod \"nova-cell0-db-create-dvfwv\" (UID: \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\") " pod="openstack/nova-cell0-db-create-dvfwv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.844324 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-operator-scripts\") pod \"nova-cell0-db-create-dvfwv\" (UID: \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\") " pod="openstack/nova-cell0-db-create-dvfwv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.846653 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afd06929-88eb-4520-a1e9-97c32d3c2223-operator-scripts\") pod \"nova-api-dae9-account-create-update-jrqz4\" (UID: \"afd06929-88eb-4520-a1e9-97c32d3c2223\") " pod="openstack/nova-api-dae9-account-create-update-jrqz4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.862060 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ada136d-baf8-411b-aa50-66edaf44a52b-operator-scripts\") pod \"nova-cell1-db-create-xlpd4\" (UID: \"0ada136d-baf8-411b-aa50-66edaf44a52b\") " pod="openstack/nova-cell1-db-create-xlpd4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.872361 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9bdm\" (UniqueName: \"kubernetes.io/projected/0ada136d-baf8-411b-aa50-66edaf44a52b-kube-api-access-x9bdm\") pod \"nova-cell1-db-create-xlpd4\" (UID: \"0ada136d-baf8-411b-aa50-66edaf44a52b\") " pod="openstack/nova-cell1-db-create-xlpd4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.872470 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8wd6\" (UniqueName: \"kubernetes.io/projected/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-kube-api-access-q8wd6\") pod \"nova-cell0-db-create-dvfwv\" (UID: \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\") " pod="openstack/nova-cell0-db-create-dvfwv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.872948 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zvqp\" (UniqueName: \"kubernetes.io/projected/afd06929-88eb-4520-a1e9-97c32d3c2223-kube-api-access-4zvqp\") pod \"nova-api-dae9-account-create-update-jrqz4\" (UID: \"afd06929-88eb-4520-a1e9-97c32d3c2223\") " pod="openstack/nova-api-dae9-account-create-update-jrqz4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.906083 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dvfwv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.924332 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-dae9-account-create-update-jrqz4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.926874 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66bd65f4cd-rdlc7" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.944770 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f129eb-3261-41d7-99e4-a6de53a7fd31-operator-scripts\") pod \"nova-cell0-057f-account-create-update-58gjv\" (UID: \"12f129eb-3261-41d7-99e4-a6de53a7fd31\") " pod="openstack/nova-cell0-057f-account-create-update-58gjv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.945159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66fks\" (UniqueName: \"kubernetes.io/projected/12f129eb-3261-41d7-99e4-a6de53a7fd31-kube-api-access-66fks\") pod \"nova-cell0-057f-account-create-update-58gjv\" (UID: \"12f129eb-3261-41d7-99e4-a6de53a7fd31\") " pod="openstack/nova-cell0-057f-account-create-update-58gjv"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.988068 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xlpd4"
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.990983 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9c42-account-create-update-4r8td"]
Dec 08 20:24:49 crc kubenswrapper[4781]: I1208 20:24:49.993249 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9c42-account-create-update-4r8td"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.001458 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.025861 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9c42-account-create-update-4r8td"]
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.047860 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f129eb-3261-41d7-99e4-a6de53a7fd31-operator-scripts\") pod \"nova-cell0-057f-account-create-update-58gjv\" (UID: \"12f129eb-3261-41d7-99e4-a6de53a7fd31\") " pod="openstack/nova-cell0-057f-account-create-update-58gjv"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.048000 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66fks\" (UniqueName: \"kubernetes.io/projected/12f129eb-3261-41d7-99e4-a6de53a7fd31-kube-api-access-66fks\") pod \"nova-cell0-057f-account-create-update-58gjv\" (UID: \"12f129eb-3261-41d7-99e4-a6de53a7fd31\") " pod="openstack/nova-cell0-057f-account-create-update-58gjv"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.048837 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f129eb-3261-41d7-99e4-a6de53a7fd31-operator-scripts\") pod \"nova-cell0-057f-account-create-update-58gjv\" (UID: \"12f129eb-3261-41d7-99e4-a6de53a7fd31\") " pod="openstack/nova-cell0-057f-account-create-update-58gjv"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.069310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66fks\" (UniqueName: \"kubernetes.io/projected/12f129eb-3261-41d7-99e4-a6de53a7fd31-kube-api-access-66fks\") pod \"nova-cell0-057f-account-create-update-58gjv\" (UID: \"12f129eb-3261-41d7-99e4-a6de53a7fd31\") " pod="openstack/nova-cell0-057f-account-create-update-58gjv"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.143313 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.149076 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvkn\" (UniqueName: \"kubernetes.io/projected/2627247a-1d1b-4959-8b8f-e8750950b5ec-kube-api-access-zmvkn\") pod \"nova-cell1-9c42-account-create-update-4r8td\" (UID: \"2627247a-1d1b-4959-8b8f-e8750950b5ec\") " pod="openstack/nova-cell1-9c42-account-create-update-4r8td"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.149213 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2627247a-1d1b-4959-8b8f-e8750950b5ec-operator-scripts\") pod \"nova-cell1-9c42-account-create-update-4r8td\" (UID: \"2627247a-1d1b-4959-8b8f-e8750950b5ec\") " pod="openstack/nova-cell1-9c42-account-create-update-4r8td"
Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.166017 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-057f-account-create-update-58gjv" Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.251023 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvkn\" (UniqueName: \"kubernetes.io/projected/2627247a-1d1b-4959-8b8f-e8750950b5ec-kube-api-access-zmvkn\") pod \"nova-cell1-9c42-account-create-update-4r8td\" (UID: \"2627247a-1d1b-4959-8b8f-e8750950b5ec\") " pod="openstack/nova-cell1-9c42-account-create-update-4r8td" Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.251171 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2627247a-1d1b-4959-8b8f-e8750950b5ec-operator-scripts\") pod \"nova-cell1-9c42-account-create-update-4r8td\" (UID: \"2627247a-1d1b-4959-8b8f-e8750950b5ec\") " pod="openstack/nova-cell1-9c42-account-create-update-4r8td" Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.252794 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2627247a-1d1b-4959-8b8f-e8750950b5ec-operator-scripts\") pod \"nova-cell1-9c42-account-create-update-4r8td\" (UID: \"2627247a-1d1b-4959-8b8f-e8750950b5ec\") " pod="openstack/nova-cell1-9c42-account-create-update-4r8td" Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.271287 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmvkn\" (UniqueName: \"kubernetes.io/projected/2627247a-1d1b-4959-8b8f-e8750950b5ec-kube-api-access-zmvkn\") pod \"nova-cell1-9c42-account-create-update-4r8td\" (UID: \"2627247a-1d1b-4959-8b8f-e8750950b5ec\") " pod="openstack/nova-cell1-9c42-account-create-update-4r8td" Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.321143 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9c42-account-create-update-4r8td" Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.626715 4781 generic.go:334] "Generic (PLEG): container finished" podID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerID="23541fb6775a7a3e1833ef55a404b04d07f4c2a4e1c2f965fc435a61169a8950" exitCode=0 Dec 08 20:24:50 crc kubenswrapper[4781]: I1208 20:24:50.626767 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerDied","Data":"23541fb6775a7a3e1833ef55a404b04d07f4c2a4e1c2f965fc435a61169a8950"} Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.184426 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.287482 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data\") pod \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.287860 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrfhg\" (UniqueName: \"kubernetes.io/projected/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-kube-api-access-hrfhg\") pod \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.287993 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data-custom\") pod \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.288039 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-combined-ca-bundle\") pod \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.288224 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-logs\") pod \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.289434 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-logs" (OuterVolumeSpecName: "logs") pod "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" (UID: "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.293405 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" (UID: "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.293819 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-kube-api-access-hrfhg" (OuterVolumeSpecName: "kube-api-access-hrfhg") pod "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" (UID: "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3"). InnerVolumeSpecName "kube-api-access-hrfhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.390058 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" (UID: "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.390206 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-combined-ca-bundle\") pod \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\" (UID: \"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3\") " Dec 08 20:24:52 crc kubenswrapper[4781]: W1208 20:24:52.390294 4781 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3/volumes/kubernetes.io~secret/combined-ca-bundle Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.390308 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" (UID: "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.390761 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrfhg\" (UniqueName: \"kubernetes.io/projected/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-kube-api-access-hrfhg\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.390779 4781 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.390791 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.390801 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.471255 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data" (OuterVolumeSpecName: "config-data") pod "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" (UID: "73418976-c6e9-4ba9-bbd6-145ffa3d5ec3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.494264 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.576963 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.662630 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea","Type":"ContainerDied","Data":"95aedae02f2eed831c0d0652f932670ae75eab4c8a22691951caeb65990441d9"} Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.662697 4781 scope.go:117] "RemoveContainer" containerID="c0d2c47f2bbffb5189b30ef17720899d7dadd7241f4877b72b0074537c85963c" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.662793 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.672540 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-55d57ffb96-9nv59" event={"ID":"73418976-c6e9-4ba9-bbd6-145ffa3d5ec3","Type":"ContainerDied","Data":"cef4a7b5ddd92c6dfbf933ba7eb649f6359a16ae961c0a953f3cca26ac9cb126"} Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.672651 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-55d57ffb96-9nv59" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.682380 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ee967571-7083-4ff7-a035-b90fadf420ee","Type":"ContainerStarted","Data":"f199cc4ae8626a0d964086262c412d2291e4b7ff8018ae0bee657198cd5532af"} Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.699750 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-config-data\") pod \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.699807 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc4zf\" (UniqueName: \"kubernetes.io/projected/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-kube-api-access-gc4zf\") pod \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.699834 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-scripts\") pod \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.699986 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-run-httpd\") pod \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.700040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-log-httpd\") pod \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.700060 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-sg-core-conf-yaml\") pod \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.700178 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-combined-ca-bundle\") pod \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\" (UID: \"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea\") " Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.706261 4781 scope.go:117] "RemoveContainer" containerID="54127ec91fe42d9fab05a73ded10541e3b21826332319330b24cca9b11c51a34" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.706957 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" (UID: "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.707073 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" (UID: "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.720999 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-scripts" (OuterVolumeSpecName: "scripts") pod "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" (UID: "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.736032 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-kube-api-access-gc4zf" (OuterVolumeSpecName: "kube-api-access-gc4zf") pod "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" (UID: "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea"). InnerVolumeSpecName "kube-api-access-gc4zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.741317 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.037559394 podStartE2EDuration="12.741293865s" podCreationTimestamp="2025-12-08 20:24:40 +0000 UTC" firstStartedPulling="2025-12-08 20:24:41.510358186 +0000 UTC m=+1197.661641563" lastFinishedPulling="2025-12-08 20:24:52.214092657 +0000 UTC m=+1208.365376034" observedRunningTime="2025-12-08 20:24:52.698383743 +0000 UTC m=+1208.849667120" watchObservedRunningTime="2025-12-08 20:24:52.741293865 +0000 UTC m=+1208.892577242" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.750466 4781 scope.go:117] "RemoveContainer" containerID="23541fb6775a7a3e1833ef55a404b04d07f4c2a4e1c2f965fc435a61169a8950" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.761849 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod 
"1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" (UID: "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.769042 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-dplq8"] Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.784031 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-55d57ffb96-9nv59"] Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.791607 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-55d57ffb96-9nv59"] Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.805343 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc4zf\" (UniqueName: \"kubernetes.io/projected/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-kube-api-access-gc4zf\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.805371 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.805381 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.805391 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.805400 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 
20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.866543 4781 scope.go:117] "RemoveContainer" containerID="4532f922a9104106b8fcfaab2f400ba80870604c911a442fb45360ed916ada7e" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.866537 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" (UID: "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.879082 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-config-data" (OuterVolumeSpecName: "config-data") pod "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" (UID: "1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.900212 4781 scope.go:117] "RemoveContainer" containerID="625ea809d498c0e401b8086b5ac3d0a4cdf7662262e4daa59b098426c120d4e1" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.908628 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.908681 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.945471 4781 scope.go:117] "RemoveContainer" containerID="2ce8da86a5c34c4bde70c84dab43dd46393f6d7dd4e26d5c08d6b8210f48d74b" Dec 08 20:24:52 crc kubenswrapper[4781]: I1208 20:24:52.946128 
4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-dae9-account-create-update-jrqz4"] Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.023812 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.060026 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.089080 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7dc689b5b9-6r5rd"] Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.111989 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:53 crc kubenswrapper[4781]: E1208 20:24:53.112472 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="ceilometer-central-agent" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112500 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="ceilometer-central-agent" Dec 08 20:24:53 crc kubenswrapper[4781]: E1208 20:24:53.112522 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="sg-core" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112531 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="sg-core" Dec 08 20:24:53 crc kubenswrapper[4781]: E1208 20:24:53.112565 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="proxy-httpd" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112574 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="proxy-httpd" Dec 08 20:24:53 crc kubenswrapper[4781]: E1208 20:24:53.112590 4781 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="ceilometer-notification-agent" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112597 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="ceilometer-notification-agent" Dec 08 20:24:53 crc kubenswrapper[4781]: E1208 20:24:53.112620 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api-log" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112628 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api-log" Dec 08 20:24:53 crc kubenswrapper[4781]: E1208 20:24:53.112642 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112650 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112894 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="proxy-httpd" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112968 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="ceilometer-notification-agent" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.112991 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="ceilometer-central-agent" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.113009 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api" Dec 08 20:24:53 crc kubenswrapper[4781]: 
I1208 20:24:53.113018 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" containerName="sg-core" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.113037 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api-log" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.115186 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.125740 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.127430 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.127458 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.228222 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-run-httpd\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.228264 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gblwp\" (UniqueName: \"kubernetes.io/projected/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-kube-api-access-gblwp\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.228297 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.228374 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-config-data\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.228415 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-scripts\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.228429 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-log-httpd\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.228506 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.298952 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xlpd4"] Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.319953 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-057f-account-create-update-58gjv"] Dec 
08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330008 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-run-httpd\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330052 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gblwp\" (UniqueName: \"kubernetes.io/projected/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-kube-api-access-gblwp\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330087 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330131 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-config-data\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330164 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-scripts\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330182 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-log-httpd\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330264 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330576 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-run-httpd\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330592 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9c42-account-create-update-4r8td"] Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.330938 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-log-httpd\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.333739 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.334557 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-scripts\") pod \"ceilometer-0\" 
(UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.345749 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.345853 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-config-data\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.349305 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gblwp\" (UniqueName: \"kubernetes.io/projected/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-kube-api-access-gblwp\") pod \"ceilometer-0\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.361854 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-dvfwv"] Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.504906 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.720413 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9c42-account-create-update-4r8td" event={"ID":"2627247a-1d1b-4959-8b8f-e8750950b5ec","Type":"ContainerStarted","Data":"5ab4bfcdad7fd61ea73c71024fcb6730076ca69d37d7b41375d7ca23506130e5"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.732970 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dvfwv" event={"ID":"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c","Type":"ContainerStarted","Data":"1633af84a6b552ebcfbe4e544734ef47d37047000849e3c86bc5850d040ac3a3"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.738100 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xlpd4" event={"ID":"0ada136d-baf8-411b-aa50-66edaf44a52b","Type":"ContainerStarted","Data":"c48d3b2347b8e4b05bd1defc6558212509d9b0f36a31dce3b3c5c8cc5df204d2"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.738228 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xlpd4" event={"ID":"0ada136d-baf8-411b-aa50-66edaf44a52b","Type":"ContainerStarted","Data":"173db89305be728135ec3dd1ba3f938c7855e0689e44b3b45fefa4c0e76ff2aa"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.741424 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-9c42-account-create-update-4r8td" podStartSLOduration=4.741404142 podStartE2EDuration="4.741404142s" podCreationTimestamp="2025-12-08 20:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:53.737813879 +0000 UTC m=+1209.889097256" watchObservedRunningTime="2025-12-08 20:24:53.741404142 +0000 UTC m=+1209.892687519" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.743643 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-057f-account-create-update-58gjv" event={"ID":"12f129eb-3261-41d7-99e4-a6de53a7fd31","Type":"ContainerStarted","Data":"04b072cbdc448525d4032344beb4b0af25b39cbad4e1101418d12d518f299ddc"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.746094 4781 generic.go:334] "Generic (PLEG): container finished" podID="4532eac0-b6a8-4560-bb45-ba9b78cc4eb4" containerID="0e108d620cb33dc584a0fa26bba94baa9648d815c3969c6c58b0e8c086d9ddab" exitCode=0 Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.746156 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dplq8" event={"ID":"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4","Type":"ContainerDied","Data":"0e108d620cb33dc584a0fa26bba94baa9648d815c3969c6c58b0e8c086d9ddab"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.746177 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dplq8" event={"ID":"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4","Type":"ContainerStarted","Data":"28bec363abeec28e0902e8c03221c4d9390f76e9727ea4a5adf6a40dce0aa047"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.749794 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dc689b5b9-6r5rd" event={"ID":"d6925de4-2c7d-43cb-b2a9-1ec66f56b007","Type":"ContainerStarted","Data":"2ae1d10e52f764da4c7b05a8f732429fd4f9f4dc10502b7886fa2874f59d4633"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.755608 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dae9-account-create-update-jrqz4" event={"ID":"afd06929-88eb-4520-a1e9-97c32d3c2223","Type":"ContainerStarted","Data":"2903d9d595f474a0edbe637cfe5462f526f70f335210ec3c44714dc6b4ce703d"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.755670 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dae9-account-create-update-jrqz4" 
event={"ID":"afd06929-88eb-4520-a1e9-97c32d3c2223","Type":"ContainerStarted","Data":"761cdac4eb3623fea48d9be5e67fe8eab3663c822b4693c73507d5f00aa2f310"} Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.776324 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-xlpd4" podStartSLOduration=4.776304575 podStartE2EDuration="4.776304575s" podCreationTimestamp="2025-12-08 20:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:53.754832648 +0000 UTC m=+1209.906116025" watchObservedRunningTime="2025-12-08 20:24:53.776304575 +0000 UTC m=+1209.927587952" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.803878 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-057f-account-create-update-58gjv" podStartSLOduration=4.803859117 podStartE2EDuration="4.803859117s" podCreationTimestamp="2025-12-08 20:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:53.788482315 +0000 UTC m=+1209.939765692" watchObservedRunningTime="2025-12-08 20:24:53.803859117 +0000 UTC m=+1209.955142494" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.817344 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-dae9-account-create-update-jrqz4" podStartSLOduration=4.817321853 podStartE2EDuration="4.817321853s" podCreationTimestamp="2025-12-08 20:24:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:53.806156233 +0000 UTC m=+1209.957439600" watchObservedRunningTime="2025-12-08 20:24:53.817321853 +0000 UTC m=+1209.968605240" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.878363 4781 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/barbican-api-55d57ffb96-9nv59" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 20:24:53 crc kubenswrapper[4781]: I1208 20:24:53.878720 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-55d57ffb96-9nv59" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.104005 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.186361 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea" path="/var/lib/kubelet/pods/1ad35b7e-59ff-4dd5-85d4-de9c0b62b8ea/volumes" Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.188164 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73418976-c6e9-4ba9-bbd6-145ffa3d5ec3" path="/var/lib/kubelet/pods/73418976-c6e9-4ba9-bbd6-145ffa3d5ec3/volumes" Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.768828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerStarted","Data":"443297816a1b9998069809f7a6ec573f7ec43589594731d0db706f8d0b01ed81"} Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.771110 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dc689b5b9-6r5rd" event={"ID":"d6925de4-2c7d-43cb-b2a9-1ec66f56b007","Type":"ContainerStarted","Data":"d0d03de8d8f3e0726a67b5dd09421463125a1e36a73f0cbf27d8c0aafaac769c"} Dec 08 20:24:54 crc kubenswrapper[4781]: 
I1208 20:24:54.771166 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7dc689b5b9-6r5rd" event={"ID":"d6925de4-2c7d-43cb-b2a9-1ec66f56b007","Type":"ContainerStarted","Data":"f373031f5b6086be6ef41298bc57873a59db6c5bcb1d8f4d1dc31a7f933e8c87"} Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.771501 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7dc689b5b9-6r5rd" Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.776548 4781 generic.go:334] "Generic (PLEG): container finished" podID="afd06929-88eb-4520-a1e9-97c32d3c2223" containerID="2903d9d595f474a0edbe637cfe5462f526f70f335210ec3c44714dc6b4ce703d" exitCode=0 Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.776639 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dae9-account-create-update-jrqz4" event={"ID":"afd06929-88eb-4520-a1e9-97c32d3c2223","Type":"ContainerDied","Data":"2903d9d595f474a0edbe637cfe5462f526f70f335210ec3c44714dc6b4ce703d"} Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.781109 4781 generic.go:334] "Generic (PLEG): container finished" podID="2627247a-1d1b-4959-8b8f-e8750950b5ec" containerID="c5fe51dde9ae8ae7b6ad5d42fea29b824b836a3924cd6b84fe8106110b673ef5" exitCode=0 Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.781167 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9c42-account-create-update-4r8td" event={"ID":"2627247a-1d1b-4959-8b8f-e8750950b5ec","Type":"ContainerDied","Data":"c5fe51dde9ae8ae7b6ad5d42fea29b824b836a3924cd6b84fe8106110b673ef5"} Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.787765 4781 generic.go:334] "Generic (PLEG): container finished" podID="63220ecf-e9cb-45c3-b7a6-ac8a631cb22c" containerID="7c9115503c2e7d7c25104a8e3be24dbba80b8d8ca0d1ecd56d371c61323d2096" exitCode=0 Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.787962 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-dvfwv" event={"ID":"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c","Type":"ContainerDied","Data":"7c9115503c2e7d7c25104a8e3be24dbba80b8d8ca0d1ecd56d371c61323d2096"} Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.793314 4781 generic.go:334] "Generic (PLEG): container finished" podID="0ada136d-baf8-411b-aa50-66edaf44a52b" containerID="c48d3b2347b8e4b05bd1defc6558212509d9b0f36a31dce3b3c5c8cc5df204d2" exitCode=0 Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.793385 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xlpd4" event={"ID":"0ada136d-baf8-411b-aa50-66edaf44a52b","Type":"ContainerDied","Data":"c48d3b2347b8e4b05bd1defc6558212509d9b0f36a31dce3b3c5c8cc5df204d2"} Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.799369 4781 generic.go:334] "Generic (PLEG): container finished" podID="12f129eb-3261-41d7-99e4-a6de53a7fd31" containerID="2426533ca9672cf99917ddb8d906aea5468713284691d9e893e68df66a2ec1d9" exitCode=0 Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.799569 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-057f-account-create-update-58gjv" event={"ID":"12f129eb-3261-41d7-99e4-a6de53a7fd31","Type":"ContainerDied","Data":"2426533ca9672cf99917ddb8d906aea5468713284691d9e893e68df66a2ec1d9"} Dec 08 20:24:54 crc kubenswrapper[4781]: I1208 20:24:54.813308 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7dc689b5b9-6r5rd" podStartSLOduration=8.81328886 podStartE2EDuration="8.81328886s" podCreationTimestamp="2025-12-08 20:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:24:54.81050353 +0000 UTC m=+1210.961786917" watchObservedRunningTime="2025-12-08 20:24:54.81328886 +0000 UTC m=+1210.964572237" Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.215738 4781 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-dplq8" Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.277108 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqxnb\" (UniqueName: \"kubernetes.io/projected/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-kube-api-access-lqxnb\") pod \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\" (UID: \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\") " Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.277280 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-operator-scripts\") pod \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\" (UID: \"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4\") " Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.279098 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4532eac0-b6a8-4560-bb45-ba9b78cc4eb4" (UID: "4532eac0-b6a8-4560-bb45-ba9b78cc4eb4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.302590 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-kube-api-access-lqxnb" (OuterVolumeSpecName: "kube-api-access-lqxnb") pod "4532eac0-b6a8-4560-bb45-ba9b78cc4eb4" (UID: "4532eac0-b6a8-4560-bb45-ba9b78cc4eb4"). InnerVolumeSpecName "kube-api-access-lqxnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.380554 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.380580 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqxnb\" (UniqueName: \"kubernetes.io/projected/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4-kube-api-access-lqxnb\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.809981 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerStarted","Data":"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d"} Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.810288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerStarted","Data":"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6"} Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.811936 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-dplq8" Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.813046 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-dplq8" event={"ID":"4532eac0-b6a8-4560-bb45-ba9b78cc4eb4","Type":"ContainerDied","Data":"28bec363abeec28e0902e8c03221c4d9390f76e9727ea4a5adf6a40dce0aa047"} Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.813095 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28bec363abeec28e0902e8c03221c4d9390f76e9727ea4a5adf6a40dce0aa047" Dec 08 20:24:55 crc kubenswrapper[4781]: I1208 20:24:55.813750 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7dc689b5b9-6r5rd" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.203640 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.255047 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-dae9-account-create-update-jrqz4" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.307859 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zvqp\" (UniqueName: \"kubernetes.io/projected/afd06929-88eb-4520-a1e9-97c32d3c2223-kube-api-access-4zvqp\") pod \"afd06929-88eb-4520-a1e9-97c32d3c2223\" (UID: \"afd06929-88eb-4520-a1e9-97c32d3c2223\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.308126 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afd06929-88eb-4520-a1e9-97c32d3c2223-operator-scripts\") pod \"afd06929-88eb-4520-a1e9-97c32d3c2223\" (UID: \"afd06929-88eb-4520-a1e9-97c32d3c2223\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.311360 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd06929-88eb-4520-a1e9-97c32d3c2223-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afd06929-88eb-4520-a1e9-97c32d3c2223" (UID: "afd06929-88eb-4520-a1e9-97c32d3c2223"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.328343 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd06929-88eb-4520-a1e9-97c32d3c2223-kube-api-access-4zvqp" (OuterVolumeSpecName: "kube-api-access-4zvqp") pod "afd06929-88eb-4520-a1e9-97c32d3c2223" (UID: "afd06929-88eb-4520-a1e9-97c32d3c2223"). InnerVolumeSpecName "kube-api-access-4zvqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.416599 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afd06929-88eb-4520-a1e9-97c32d3c2223-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.416631 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zvqp\" (UniqueName: \"kubernetes.io/projected/afd06929-88eb-4520-a1e9-97c32d3c2223-kube-api-access-4zvqp\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.417031 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xlpd4" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.517673 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9bdm\" (UniqueName: \"kubernetes.io/projected/0ada136d-baf8-411b-aa50-66edaf44a52b-kube-api-access-x9bdm\") pod \"0ada136d-baf8-411b-aa50-66edaf44a52b\" (UID: \"0ada136d-baf8-411b-aa50-66edaf44a52b\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.518056 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ada136d-baf8-411b-aa50-66edaf44a52b-operator-scripts\") pod \"0ada136d-baf8-411b-aa50-66edaf44a52b\" (UID: \"0ada136d-baf8-411b-aa50-66edaf44a52b\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.518540 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ada136d-baf8-411b-aa50-66edaf44a52b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0ada136d-baf8-411b-aa50-66edaf44a52b" (UID: "0ada136d-baf8-411b-aa50-66edaf44a52b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.518657 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ada136d-baf8-411b-aa50-66edaf44a52b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.521571 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ada136d-baf8-411b-aa50-66edaf44a52b-kube-api-access-x9bdm" (OuterVolumeSpecName: "kube-api-access-x9bdm") pod "0ada136d-baf8-411b-aa50-66edaf44a52b" (UID: "0ada136d-baf8-411b-aa50-66edaf44a52b"). InnerVolumeSpecName "kube-api-access-x9bdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.541105 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-057f-account-create-update-58gjv" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.568644 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-dvfwv" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.592024 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9c42-account-create-update-4r8td" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.619901 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f129eb-3261-41d7-99e4-a6de53a7fd31-operator-scripts\") pod \"12f129eb-3261-41d7-99e4-a6de53a7fd31\" (UID: \"12f129eb-3261-41d7-99e4-a6de53a7fd31\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.619995 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66fks\" (UniqueName: \"kubernetes.io/projected/12f129eb-3261-41d7-99e4-a6de53a7fd31-kube-api-access-66fks\") pod \"12f129eb-3261-41d7-99e4-a6de53a7fd31\" (UID: \"12f129eb-3261-41d7-99e4-a6de53a7fd31\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.620091 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8wd6\" (UniqueName: \"kubernetes.io/projected/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-kube-api-access-q8wd6\") pod \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\" (UID: \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.620254 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-operator-scripts\") pod \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\" (UID: \"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.620758 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9bdm\" (UniqueName: \"kubernetes.io/projected/0ada136d-baf8-411b-aa50-66edaf44a52b-kube-api-access-x9bdm\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.621160 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63220ecf-e9cb-45c3-b7a6-ac8a631cb22c" (UID: "63220ecf-e9cb-45c3-b7a6-ac8a631cb22c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.624265 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-kube-api-access-q8wd6" (OuterVolumeSpecName: "kube-api-access-q8wd6") pod "63220ecf-e9cb-45c3-b7a6-ac8a631cb22c" (UID: "63220ecf-e9cb-45c3-b7a6-ac8a631cb22c"). InnerVolumeSpecName "kube-api-access-q8wd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.624461 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f129eb-3261-41d7-99e4-a6de53a7fd31-kube-api-access-66fks" (OuterVolumeSpecName: "kube-api-access-66fks") pod "12f129eb-3261-41d7-99e4-a6de53a7fd31" (UID: "12f129eb-3261-41d7-99e4-a6de53a7fd31"). InnerVolumeSpecName "kube-api-access-66fks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.627437 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12f129eb-3261-41d7-99e4-a6de53a7fd31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12f129eb-3261-41d7-99e4-a6de53a7fd31" (UID: "12f129eb-3261-41d7-99e4-a6de53a7fd31"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.721393 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmvkn\" (UniqueName: \"kubernetes.io/projected/2627247a-1d1b-4959-8b8f-e8750950b5ec-kube-api-access-zmvkn\") pod \"2627247a-1d1b-4959-8b8f-e8750950b5ec\" (UID: \"2627247a-1d1b-4959-8b8f-e8750950b5ec\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.721538 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2627247a-1d1b-4959-8b8f-e8750950b5ec-operator-scripts\") pod \"2627247a-1d1b-4959-8b8f-e8750950b5ec\" (UID: \"2627247a-1d1b-4959-8b8f-e8750950b5ec\") " Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.721965 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12f129eb-3261-41d7-99e4-a6de53a7fd31-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.721979 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66fks\" (UniqueName: \"kubernetes.io/projected/12f129eb-3261-41d7-99e4-a6de53a7fd31-kube-api-access-66fks\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.721988 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8wd6\" (UniqueName: \"kubernetes.io/projected/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-kube-api-access-q8wd6\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.721997 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.723052 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2627247a-1d1b-4959-8b8f-e8750950b5ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2627247a-1d1b-4959-8b8f-e8750950b5ec" (UID: "2627247a-1d1b-4959-8b8f-e8750950b5ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.725188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2627247a-1d1b-4959-8b8f-e8750950b5ec-kube-api-access-zmvkn" (OuterVolumeSpecName: "kube-api-access-zmvkn") pod "2627247a-1d1b-4959-8b8f-e8750950b5ec" (UID: "2627247a-1d1b-4959-8b8f-e8750950b5ec"). InnerVolumeSpecName "kube-api-access-zmvkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.819151 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-dae9-account-create-update-jrqz4" event={"ID":"afd06929-88eb-4520-a1e9-97c32d3c2223","Type":"ContainerDied","Data":"761cdac4eb3623fea48d9be5e67fe8eab3663c822b4693c73507d5f00aa2f310"} Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.819187 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="761cdac4eb3623fea48d9be5e67fe8eab3663c822b4693c73507d5f00aa2f310" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.819218 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-dae9-account-create-update-jrqz4" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.823396 4781 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2627247a-1d1b-4959-8b8f-e8750950b5ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.823430 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmvkn\" (UniqueName: \"kubernetes.io/projected/2627247a-1d1b-4959-8b8f-e8750950b5ec-kube-api-access-zmvkn\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.827142 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9c42-account-create-update-4r8td" event={"ID":"2627247a-1d1b-4959-8b8f-e8750950b5ec","Type":"ContainerDied","Data":"5ab4bfcdad7fd61ea73c71024fcb6730076ca69d37d7b41375d7ca23506130e5"} Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.827180 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ab4bfcdad7fd61ea73c71024fcb6730076ca69d37d7b41375d7ca23506130e5" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.827159 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9c42-account-create-update-4r8td" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.828553 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-dvfwv" event={"ID":"63220ecf-e9cb-45c3-b7a6-ac8a631cb22c","Type":"ContainerDied","Data":"1633af84a6b552ebcfbe4e544734ef47d37047000849e3c86bc5850d040ac3a3"} Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.828579 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-dvfwv" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.828585 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1633af84a6b552ebcfbe4e544734ef47d37047000849e3c86bc5850d040ac3a3" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.829655 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xlpd4" event={"ID":"0ada136d-baf8-411b-aa50-66edaf44a52b","Type":"ContainerDied","Data":"173db89305be728135ec3dd1ba3f938c7855e0689e44b3b45fefa4c0e76ff2aa"} Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.829680 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="173db89305be728135ec3dd1ba3f938c7855e0689e44b3b45fefa4c0e76ff2aa" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.829729 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xlpd4" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.833156 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-057f-account-create-update-58gjv" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.833168 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-057f-account-create-update-58gjv" event={"ID":"12f129eb-3261-41d7-99e4-a6de53a7fd31","Type":"ContainerDied","Data":"04b072cbdc448525d4032344beb4b0af25b39cbad4e1101418d12d518f299ddc"} Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.833207 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04b072cbdc448525d4032344beb4b0af25b39cbad4e1101418d12d518f299ddc" Dec 08 20:24:56 crc kubenswrapper[4781]: I1208 20:24:56.836514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerStarted","Data":"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305"} Dec 08 20:24:57 crc kubenswrapper[4781]: I1208 20:24:57.846137 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerStarted","Data":"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2"} Dec 08 20:24:57 crc kubenswrapper[4781]: I1208 20:24:57.846492 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 20:24:57 crc kubenswrapper[4781]: I1208 20:24:57.846363 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="proxy-httpd" containerID="cri-o://64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2" gracePeriod=30 Dec 08 20:24:57 crc kubenswrapper[4781]: I1208 20:24:57.846417 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="ceilometer-notification-agent" 
containerID="cri-o://10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d" gracePeriod=30 Dec 08 20:24:57 crc kubenswrapper[4781]: I1208 20:24:57.846434 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="sg-core" containerID="cri-o://3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305" gracePeriod=30 Dec 08 20:24:57 crc kubenswrapper[4781]: I1208 20:24:57.846306 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="ceilometer-central-agent" containerID="cri-o://4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6" gracePeriod=30 Dec 08 20:24:57 crc kubenswrapper[4781]: I1208 20:24:57.871369 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.535391314 podStartE2EDuration="4.871332547s" podCreationTimestamp="2025-12-08 20:24:53 +0000 UTC" firstStartedPulling="2025-12-08 20:24:54.103157205 +0000 UTC m=+1210.254440582" lastFinishedPulling="2025-12-08 20:24:57.439098438 +0000 UTC m=+1213.590381815" observedRunningTime="2025-12-08 20:24:57.868776833 +0000 UTC m=+1214.020060230" watchObservedRunningTime="2025-12-08 20:24:57.871332547 +0000 UTC m=+1214.022615924" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.610296 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.662047 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-config-data\") pod \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.662144 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-sg-core-conf-yaml\") pod \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.662232 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-combined-ca-bundle\") pod \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.662251 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gblwp\" (UniqueName: \"kubernetes.io/projected/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-kube-api-access-gblwp\") pod \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.662297 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-log-httpd\") pod \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.662335 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-run-httpd\") pod \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.662363 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-scripts\") pod \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\" (UID: \"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308\") " Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.663570 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" (UID: "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.664179 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" (UID: "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.668772 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-scripts" (OuterVolumeSpecName: "scripts") pod "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" (UID: "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.670063 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-kube-api-access-gblwp" (OuterVolumeSpecName: "kube-api-access-gblwp") pod "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" (UID: "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308"). InnerVolumeSpecName "kube-api-access-gblwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.695437 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" (UID: "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.735624 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" (UID: "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.756657 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-config-data" (OuterVolumeSpecName: "config-data") pod "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" (UID: "5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.764783 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.764820 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.764833 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.764843 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gblwp\" (UniqueName: \"kubernetes.io/projected/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-kube-api-access-gblwp\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.764854 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.764864 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.764873 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.857969 4781 generic.go:334] "Generic 
(PLEG): container finished" podID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerID="64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2" exitCode=0 Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858009 4781 generic.go:334] "Generic (PLEG): container finished" podID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerID="3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305" exitCode=2 Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858016 4781 generic.go:334] "Generic (PLEG): container finished" podID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerID="10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d" exitCode=0 Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858023 4781 generic.go:334] "Generic (PLEG): container finished" podID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerID="4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6" exitCode=0 Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858055 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerDied","Data":"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2"} Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858111 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerDied","Data":"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305"} Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858123 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerDied","Data":"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d"} Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858133 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerDied","Data":"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6"} Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858141 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308","Type":"ContainerDied","Data":"443297816a1b9998069809f7a6ec573f7ec43589594731d0db706f8d0b01ed81"} Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.858138 4781 scope.go:117] "RemoveContainer" containerID="64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.859182 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.887150 4781 scope.go:117] "RemoveContainer" containerID="3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.902099 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.911598 4781 scope.go:117] "RemoveContainer" containerID="10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.916849 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.932818 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.933447 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="sg-core" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.933526 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="sg-core" Dec 08 20:24:58 crc 
kubenswrapper[4781]: E1208 20:24:58.933600 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="ceilometer-notification-agent" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.933674 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="ceilometer-notification-agent" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.933741 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12f129eb-3261-41d7-99e4-a6de53a7fd31" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.933824 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f129eb-3261-41d7-99e4-a6de53a7fd31" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.933887 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="proxy-httpd" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.933964 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="proxy-httpd" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.934027 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="ceilometer-central-agent" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.934091 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="ceilometer-central-agent" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.934145 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2627247a-1d1b-4959-8b8f-e8750950b5ec" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.934199 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2627247a-1d1b-4959-8b8f-e8750950b5ec" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.934256 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4532eac0-b6a8-4560-bb45-ba9b78cc4eb4" containerName="mariadb-database-create" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.934311 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4532eac0-b6a8-4560-bb45-ba9b78cc4eb4" containerName="mariadb-database-create" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.934401 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd06929-88eb-4520-a1e9-97c32d3c2223" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.934477 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd06929-88eb-4520-a1e9-97c32d3c2223" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.934541 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63220ecf-e9cb-45c3-b7a6-ac8a631cb22c" containerName="mariadb-database-create" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.934596 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="63220ecf-e9cb-45c3-b7a6-ac8a631cb22c" containerName="mariadb-database-create" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.934650 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ada136d-baf8-411b-aa50-66edaf44a52b" containerName="mariadb-database-create" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.934703 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ada136d-baf8-411b-aa50-66edaf44a52b" containerName="mariadb-database-create" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.934965 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4532eac0-b6a8-4560-bb45-ba9b78cc4eb4" containerName="mariadb-database-create" Dec 08 20:24:58 crc 
kubenswrapper[4781]: I1208 20:24:58.935049 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="63220ecf-e9cb-45c3-b7a6-ac8a631cb22c" containerName="mariadb-database-create" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.935125 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="12f129eb-3261-41d7-99e4-a6de53a7fd31" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.935213 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ada136d-baf8-411b-aa50-66edaf44a52b" containerName="mariadb-database-create" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.935293 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2627247a-1d1b-4959-8b8f-e8750950b5ec" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.935355 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="sg-core" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.935724 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="proxy-httpd" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.936810 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd06929-88eb-4520-a1e9-97c32d3c2223" containerName="mariadb-account-create-update" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.936960 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="ceilometer-notification-agent" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.937061 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" containerName="ceilometer-central-agent" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.940356 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.946168 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.946034 4781 scope.go:117] "RemoveContainer" containerID="4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.946368 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.965126 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.973789 4781 scope.go:117] "RemoveContainer" containerID="64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.974204 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": container with ID starting with 64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2 not found: ID does not exist" containerID="64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.974229 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2"} err="failed to get container status \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": rpc error: code = NotFound desc = could not find container \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": container with ID starting with 64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 
20:24:58.974247 4781 scope.go:117] "RemoveContainer" containerID="3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.974545 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": container with ID starting with 3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305 not found: ID does not exist" containerID="3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.974566 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305"} err="failed to get container status \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": rpc error: code = NotFound desc = could not find container \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": container with ID starting with 3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.974581 4781 scope.go:117] "RemoveContainer" containerID="10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.974737 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": container with ID starting with 10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d not found: ID does not exist" containerID="10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.974754 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d"} err="failed to get container status \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": rpc error: code = NotFound desc = could not find container \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": container with ID starting with 10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.974765 4781 scope.go:117] "RemoveContainer" containerID="4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6" Dec 08 20:24:58 crc kubenswrapper[4781]: E1208 20:24:58.974951 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": container with ID starting with 4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6 not found: ID does not exist" containerID="4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.974966 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6"} err="failed to get container status \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": rpc error: code = NotFound desc = could not find container \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": container with ID starting with 4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.974978 4781 scope.go:117] "RemoveContainer" containerID="64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975147 4781 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2"} err="failed to get container status \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": rpc error: code = NotFound desc = could not find container \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": container with ID starting with 64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975163 4781 scope.go:117] "RemoveContainer" containerID="3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975379 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305"} err="failed to get container status \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": rpc error: code = NotFound desc = could not find container \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": container with ID starting with 3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975398 4781 scope.go:117] "RemoveContainer" containerID="10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975604 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d"} err="failed to get container status \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": rpc error: code = NotFound desc = could not find container \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": container with ID starting with 10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d not 
found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975624 4781 scope.go:117] "RemoveContainer" containerID="4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975779 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6"} err="failed to get container status \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": rpc error: code = NotFound desc = could not find container \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": container with ID starting with 4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975801 4781 scope.go:117] "RemoveContainer" containerID="64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.975996 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2"} err="failed to get container status \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": rpc error: code = NotFound desc = could not find container \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": container with ID starting with 64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.976015 4781 scope.go:117] "RemoveContainer" containerID="3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.976271 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305"} err="failed to get 
container status \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": rpc error: code = NotFound desc = could not find container \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": container with ID starting with 3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.976286 4781 scope.go:117] "RemoveContainer" containerID="10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.976505 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d"} err="failed to get container status \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": rpc error: code = NotFound desc = could not find container \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": container with ID starting with 10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.976523 4781 scope.go:117] "RemoveContainer" containerID="4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.976750 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6"} err="failed to get container status \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": rpc error: code = NotFound desc = could not find container \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": container with ID starting with 4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.976773 4781 scope.go:117] "RemoveContainer" 
containerID="64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.977014 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2"} err="failed to get container status \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": rpc error: code = NotFound desc = could not find container \"64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2\": container with ID starting with 64f8601e2e86729e615c63022ef45c4e4acbfbaa2040ab20c55498dd9838c0d2 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.977029 4781 scope.go:117] "RemoveContainer" containerID="3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.977404 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305"} err="failed to get container status \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": rpc error: code = NotFound desc = could not find container \"3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305\": container with ID starting with 3508a4d1215708512a3ee8b4920dcc55c023dab6091315bb211280127f6dc305 not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.977420 4781 scope.go:117] "RemoveContainer" containerID="10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.977630 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d"} err="failed to get container status \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": rpc error: code = NotFound desc = could 
not find container \"10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d\": container with ID starting with 10c63234af85da8df9c2c081972649fc40748baf32f5b54834281a71d23dd47d not found: ID does not exist" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.977644 4781 scope.go:117] "RemoveContainer" containerID="4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6" Dec 08 20:24:58 crc kubenswrapper[4781]: I1208 20:24:58.977848 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6"} err="failed to get container status \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": rpc error: code = NotFound desc = could not find container \"4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6\": container with ID starting with 4e96c9e9f3a5e2785a3617d4d03b7d1358fc8c89cf909cb64fcadb0f28ac85c6 not found: ID does not exist" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.070042 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svz9\" (UniqueName: \"kubernetes.io/projected/82caa489-b266-4d63-b903-b88755343da6-kube-api-access-4svz9\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.070094 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.070117 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.070237 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-log-httpd\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.070279 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-config-data\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.070308 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-run-httpd\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.070590 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-scripts\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.172835 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4svz9\" (UniqueName: \"kubernetes.io/projected/82caa489-b266-4d63-b903-b88755343da6-kube-api-access-4svz9\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " 
pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.172889 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.172930 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.172972 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-log-httpd\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.172996 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-config-data\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.173024 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-run-httpd\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.173152 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-scripts\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.173914 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-run-httpd\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.174047 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-log-httpd\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.179157 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-scripts\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.179396 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.186812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.187932 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-config-data\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.195162 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4svz9\" (UniqueName: \"kubernetes.io/projected/82caa489-b266-4d63-b903-b88755343da6-kube-api-access-4svz9\") pod \"ceilometer-0\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.272744 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.703073 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.869202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerStarted","Data":"6e95ab9cc6205ba990990920881bd0ebff0e38b2dffa0b6a18e80ba66feda775"} Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.926182 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-66bd65f4cd-rdlc7" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.926313 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.947729 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:24:59 crc kubenswrapper[4781]: I1208 20:24:59.947789 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.030695 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrfdd"] Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.032451 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.035393 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.035500 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.035544 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dnkn2" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.050434 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrfdd"] Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.097428 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-config-data\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 
08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.097492 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-scripts\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.097516 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.097572 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wcx9\" (UniqueName: \"kubernetes.io/projected/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-kube-api-access-6wcx9\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.141454 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308" path="/var/lib/kubelet/pods/5173cf2e-3dcf-4eb9-8dbb-17f3c3b43308/volumes" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.199243 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-config-data\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.199313 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-scripts\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.199340 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.199395 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wcx9\" (UniqueName: \"kubernetes.io/projected/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-kube-api-access-6wcx9\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.203054 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.203957 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-scripts\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.205144 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-config-data\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.217024 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wcx9\" (UniqueName: \"kubernetes.io/projected/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-kube-api-access-6wcx9\") pod \"nova-cell0-conductor-db-sync-zrfdd\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.385631 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.844140 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrfdd"] Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.878949 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" event={"ID":"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed","Type":"ContainerStarted","Data":"422f1b34bd189d3cdc81779501e69f9126939e8dbdfa1b04bdfb6bf3f1d7faca"} Dec 08 20:25:00 crc kubenswrapper[4781]: I1208 20:25:00.880257 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerStarted","Data":"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da"} Dec 08 20:25:01 crc kubenswrapper[4781]: I1208 20:25:01.890726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerStarted","Data":"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e"} Dec 08 20:25:01 crc kubenswrapper[4781]: 
I1208 20:25:01.892088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerStarted","Data":"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd"} Dec 08 20:25:02 crc kubenswrapper[4781]: I1208 20:25:02.250272 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7dc689b5b9-6r5rd" Dec 08 20:25:02 crc kubenswrapper[4781]: I1208 20:25:02.256302 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7dc689b5b9-6r5rd" Dec 08 20:25:03 crc kubenswrapper[4781]: I1208 20:25:03.916113 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerStarted","Data":"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf"} Dec 08 20:25:03 crc kubenswrapper[4781]: I1208 20:25:03.917115 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 20:25:03 crc kubenswrapper[4781]: I1208 20:25:03.942375 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.770652694 podStartE2EDuration="5.942351737s" podCreationTimestamp="2025-12-08 20:24:58 +0000 UTC" firstStartedPulling="2025-12-08 20:24:59.709822133 +0000 UTC m=+1215.861105510" lastFinishedPulling="2025-12-08 20:25:02.881521176 +0000 UTC m=+1219.032804553" observedRunningTime="2025-12-08 20:25:03.933360898 +0000 UTC m=+1220.084644295" watchObservedRunningTime="2025-12-08 20:25:03.942351737 +0000 UTC m=+1220.093635114" Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516249 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4532eac0_b6a8_4560_bb45_ba9b78cc4eb4.slice/crio-28bec363abeec28e0902e8c03221c4d9390f76e9727ea4a5adf6a40dce0aa047": 
0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4532eac0_b6a8_4560_bb45_ba9b78cc4eb4.slice/crio-28bec363abeec28e0902e8c03221c4d9390f76e9727ea4a5adf6a40dce0aa047: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516510 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4532eac0_b6a8_4560_bb45_ba9b78cc4eb4.slice/crio-conmon-0e108d620cb33dc584a0fa26bba94baa9648d815c3969c6c58b0e8c086d9ddab.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4532eac0_b6a8_4560_bb45_ba9b78cc4eb4.slice/crio-conmon-0e108d620cb33dc584a0fa26bba94baa9648d815c3969c6c58b0e8c086d9ddab.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516527 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4532eac0_b6a8_4560_bb45_ba9b78cc4eb4.slice/crio-0e108d620cb33dc584a0fa26bba94baa9648d815c3969c6c58b0e8c086d9ddab.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4532eac0_b6a8_4560_bb45_ba9b78cc4eb4.slice/crio-0e108d620cb33dc584a0fa26bba94baa9648d815c3969c6c58b0e8c086d9ddab.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516539 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd06929_88eb_4520_a1e9_97c32d3c2223.slice/crio-761cdac4eb3623fea48d9be5e67fe8eab3663c822b4693c73507d5f00aa2f310": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd06929_88eb_4520_a1e9_97c32d3c2223.slice/crio-761cdac4eb3623fea48d9be5e67fe8eab3663c822b4693c73507d5f00aa2f310: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516654 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd06929_88eb_4520_a1e9_97c32d3c2223.slice/crio-conmon-2903d9d595f474a0edbe637cfe5462f526f70f335210ec3c44714dc6b4ce703d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd06929_88eb_4520_a1e9_97c32d3c2223.slice/crio-conmon-2903d9d595f474a0edbe637cfe5462f526f70f335210ec3c44714dc6b4ce703d.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516672 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f129eb_3261_41d7_99e4_a6de53a7fd31.slice/crio-04b072cbdc448525d4032344beb4b0af25b39cbad4e1101418d12d518f299ddc": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f129eb_3261_41d7_99e4_a6de53a7fd31.slice/crio-04b072cbdc448525d4032344beb4b0af25b39cbad4e1101418d12d518f299ddc: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516687 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ada136d_baf8_411b_aa50_66edaf44a52b.slice/crio-173db89305be728135ec3dd1ba3f938c7855e0689e44b3b45fefa4c0e76ff2aa": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ada136d_baf8_411b_aa50_66edaf44a52b.slice/crio-173db89305be728135ec3dd1ba3f938c7855e0689e44b3b45fefa4c0e76ff2aa: no such file or directory Dec 08 
20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516702 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2627247a_1d1b_4959_8b8f_e8750950b5ec.slice/crio-5ab4bfcdad7fd61ea73c71024fcb6730076ca69d37d7b41375d7ca23506130e5": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2627247a_1d1b_4959_8b8f_e8750950b5ec.slice/crio-5ab4bfcdad7fd61ea73c71024fcb6730076ca69d37d7b41375d7ca23506130e5: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516716 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63220ecf_e9cb_45c3_b7a6_ac8a631cb22c.slice/crio-1633af84a6b552ebcfbe4e544734ef47d37047000849e3c86bc5850d040ac3a3": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63220ecf_e9cb_45c3_b7a6_ac8a631cb22c.slice/crio-1633af84a6b552ebcfbe4e544734ef47d37047000849e3c86bc5850d040ac3a3: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516730 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f129eb_3261_41d7_99e4_a6de53a7fd31.slice/crio-conmon-2426533ca9672cf99917ddb8d906aea5468713284691d9e893e68df66a2ec1d9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f129eb_3261_41d7_99e4_a6de53a7fd31.slice/crio-conmon-2426533ca9672cf99917ddb8d906aea5468713284691d9e893e68df66a2ec1d9.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516742 4781 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ada136d_baf8_411b_aa50_66edaf44a52b.slice/crio-conmon-c48d3b2347b8e4b05bd1defc6558212509d9b0f36a31dce3b3c5c8cc5df204d2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ada136d_baf8_411b_aa50_66edaf44a52b.slice/crio-conmon-c48d3b2347b8e4b05bd1defc6558212509d9b0f36a31dce3b3c5c8cc5df204d2.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516834 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd06929_88eb_4520_a1e9_97c32d3c2223.slice/crio-2903d9d595f474a0edbe637cfe5462f526f70f335210ec3c44714dc6b4ce703d.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd06929_88eb_4520_a1e9_97c32d3c2223.slice/crio-2903d9d595f474a0edbe637cfe5462f526f70f335210ec3c44714dc6b4ce703d.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516854 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5173cf2e_3dcf_4eb9_8dbb_17f3c3b43308.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5173cf2e_3dcf_4eb9_8dbb_17f3c3b43308.slice: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516870 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63220ecf_e9cb_45c3_b7a6_ac8a631cb22c.slice/crio-conmon-7c9115503c2e7d7c25104a8e3be24dbba80b8d8ca0d1ecd56d371c61323d2096.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63220ecf_e9cb_45c3_b7a6_ac8a631cb22c.slice/crio-conmon-7c9115503c2e7d7c25104a8e3be24dbba80b8d8ca0d1ecd56d371c61323d2096.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516884 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2627247a_1d1b_4959_8b8f_e8750950b5ec.slice/crio-conmon-c5fe51dde9ae8ae7b6ad5d42fea29b824b836a3924cd6b84fe8106110b673ef5.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2627247a_1d1b_4959_8b8f_e8750950b5ec.slice/crio-conmon-c5fe51dde9ae8ae7b6ad5d42fea29b824b836a3924cd6b84fe8106110b673ef5.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516896 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f129eb_3261_41d7_99e4_a6de53a7fd31.slice/crio-2426533ca9672cf99917ddb8d906aea5468713284691d9e893e68df66a2ec1d9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f129eb_3261_41d7_99e4_a6de53a7fd31.slice/crio-2426533ca9672cf99917ddb8d906aea5468713284691d9e893e68df66a2ec1d9.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.516999 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ada136d_baf8_411b_aa50_66edaf44a52b.slice/crio-c48d3b2347b8e4b05bd1defc6558212509d9b0f36a31dce3b3c5c8cc5df204d2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ada136d_baf8_411b_aa50_66edaf44a52b.slice/crio-c48d3b2347b8e4b05bd1defc6558212509d9b0f36a31dce3b3c5c8cc5df204d2.scope: no 
such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.517014 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2627247a_1d1b_4959_8b8f_e8750950b5ec.slice/crio-c5fe51dde9ae8ae7b6ad5d42fea29b824b836a3924cd6b84fe8106110b673ef5.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2627247a_1d1b_4959_8b8f_e8750950b5ec.slice/crio-c5fe51dde9ae8ae7b6ad5d42fea29b824b836a3924cd6b84fe8106110b673ef5.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: W1208 20:25:05.517027 4781 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63220ecf_e9cb_45c3_b7a6_ac8a631cb22c.slice/crio-7c9115503c2e7d7c25104a8e3be24dbba80b8d8ca0d1ecd56d371c61323d2096.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63220ecf_e9cb_45c3_b7a6_ac8a631cb22c.slice/crio-7c9115503c2e7d7c25104a8e3be24dbba80b8d8ca0d1ecd56d371c61323d2096.scope: no such file or directory Dec 08 20:25:05 crc kubenswrapper[4781]: E1208 20:25:05.782253 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12f129eb_3261_41d7_99e4_a6de53a7fd31.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73418976_c6e9_4ba9_bbd6_145ffa3d5ec3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63220ecf_e9cb_45c3_b7a6_ac8a631cb22c.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ad35b7e_59ff_4dd5_85d4_de9c0b62b8ea.slice/crio-95aedae02f2eed831c0d0652f932670ae75eab4c8a22691951caeb65990441d9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73418976_c6e9_4ba9_bbd6_145ffa3d5ec3.slice/crio-cef4a7b5ddd92c6dfbf933ba7eb649f6359a16ae961c0a953f3cca26ac9cb126\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b714ffd_7c31_434d_a833_04abe6c8dcfb.slice/crio-458aad91b280fae6c403fbd40963945d0173a54e50791aa99b5456b5962d7f8c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b714ffd_7c31_434d_a833_04abe6c8dcfb.slice/crio-conmon-458aad91b280fae6c403fbd40963945d0173a54e50791aa99b5456b5962d7f8c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ada136d_baf8_411b_aa50_66edaf44a52b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4532eac0_b6a8_4560_bb45_ba9b78cc4eb4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2627247a_1d1b_4959_8b8f_e8750950b5ec.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ad35b7e_59ff_4dd5_85d4_de9c0b62b8ea.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd06929_88eb_4520_a1e9_97c32d3c2223.slice\": RecentStats: unable to find data in memory cache]" Dec 08 20:25:05 crc kubenswrapper[4781]: I1208 20:25:05.933287 4781 generic.go:334] "Generic (PLEG): container finished" podID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" 
containerID="458aad91b280fae6c403fbd40963945d0173a54e50791aa99b5456b5962d7f8c" exitCode=137 Dec 08 20:25:05 crc kubenswrapper[4781]: I1208 20:25:05.933335 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bd65f4cd-rdlc7" event={"ID":"2b714ffd-7c31-434d-a833-04abe6c8dcfb","Type":"ContainerDied","Data":"458aad91b280fae6c403fbd40963945d0173a54e50791aa99b5456b5962d7f8c"} Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.496375 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.497208 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="ceilometer-central-agent" containerID="cri-o://fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da" gracePeriod=30 Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.497822 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="proxy-httpd" containerID="cri-o://a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf" gracePeriod=30 Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.497885 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="sg-core" containerID="cri-o://8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e" gracePeriod=30 Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.498011 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="ceilometer-notification-agent" containerID="cri-o://de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd" gracePeriod=30 Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 
20:25:09.916782 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.987047 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-combined-ca-bundle\") pod \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.987119 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8mpj\" (UniqueName: \"kubernetes.io/projected/2b714ffd-7c31-434d-a833-04abe6c8dcfb-kube-api-access-n8mpj\") pod \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.987165 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-secret-key\") pod \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.987185 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-tls-certs\") pod \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.987286 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-scripts\") pod \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.987343 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b714ffd-7c31-434d-a833-04abe6c8dcfb-logs\") pod \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.987449 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-config-data\") pod \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\" (UID: \"2b714ffd-7c31-434d-a833-04abe6c8dcfb\") " Dec 08 20:25:09 crc kubenswrapper[4781]: I1208 20:25:09.988040 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b714ffd-7c31-434d-a833-04abe6c8dcfb-logs" (OuterVolumeSpecName: "logs") pod "2b714ffd-7c31-434d-a833-04abe6c8dcfb" (UID: "2b714ffd-7c31-434d-a833-04abe6c8dcfb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.039413 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66bd65f4cd-rdlc7" event={"ID":"2b714ffd-7c31-434d-a833-04abe6c8dcfb","Type":"ContainerDied","Data":"9ba381cea379a839b22c5acc1351e225254ba61d93785dd53b1a15e8158c703f"} Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.039475 4781 scope.go:117] "RemoveContainer" containerID="02389c5659f9a03f3983d67ab739b10de654ad7c39804e031b67fa0e1b85821a" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.039676 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66bd65f4cd-rdlc7" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.224707 4781 generic.go:334] "Generic (PLEG): container finished" podID="82caa489-b266-4d63-b903-b88755343da6" containerID="a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf" exitCode=0 Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.224739 4781 generic.go:334] "Generic (PLEG): container finished" podID="82caa489-b266-4d63-b903-b88755343da6" containerID="8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e" exitCode=2 Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.224759 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerDied","Data":"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf"} Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.224782 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerDied","Data":"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e"} Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.424072 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2b714ffd-7c31-434d-a833-04abe6c8dcfb" (UID: "2b714ffd-7c31-434d-a833-04abe6c8dcfb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.425003 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-scripts" (OuterVolumeSpecName: "scripts") pod "2b714ffd-7c31-434d-a833-04abe6c8dcfb" (UID: "2b714ffd-7c31-434d-a833-04abe6c8dcfb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.425188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b714ffd-7c31-434d-a833-04abe6c8dcfb-kube-api-access-n8mpj" (OuterVolumeSpecName: "kube-api-access-n8mpj") pod "2b714ffd-7c31-434d-a833-04abe6c8dcfb" (UID: "2b714ffd-7c31-434d-a833-04abe6c8dcfb"). InnerVolumeSpecName "kube-api-access-n8mpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.426676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-config-data" (OuterVolumeSpecName: "config-data") pod "2b714ffd-7c31-434d-a833-04abe6c8dcfb" (UID: "2b714ffd-7c31-434d-a833-04abe6c8dcfb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.426754 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.426773 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.426785 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b714ffd-7c31-434d-a833-04abe6c8dcfb-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.426794 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8mpj\" (UniqueName: \"kubernetes.io/projected/2b714ffd-7c31-434d-a833-04abe6c8dcfb-kube-api-access-n8mpj\") on node \"crc\" 
DevicePath \"\"" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.428683 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b714ffd-7c31-434d-a833-04abe6c8dcfb" (UID: "2b714ffd-7c31-434d-a833-04abe6c8dcfb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.458565 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "2b714ffd-7c31-434d-a833-04abe6c8dcfb" (UID: "2b714ffd-7c31-434d-a833-04abe6c8dcfb"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.531015 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b714ffd-7c31-434d-a833-04abe6c8dcfb-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.531051 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.531064 4781 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b714ffd-7c31-434d-a833-04abe6c8dcfb-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.651725 4781 scope.go:117] "RemoveContainer" containerID="458aad91b280fae6c403fbd40963945d0173a54e50791aa99b5456b5962d7f8c" Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.833691 4781 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/horizon-66bd65f4cd-rdlc7"] Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.844607 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-66bd65f4cd-rdlc7"] Dec 08 20:25:10 crc kubenswrapper[4781]: I1208 20:25:10.996127 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.144366 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-run-httpd\") pod \"82caa489-b266-4d63-b903-b88755343da6\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.144431 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4svz9\" (UniqueName: \"kubernetes.io/projected/82caa489-b266-4d63-b903-b88755343da6-kube-api-access-4svz9\") pod \"82caa489-b266-4d63-b903-b88755343da6\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.144466 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-config-data\") pod \"82caa489-b266-4d63-b903-b88755343da6\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.144534 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-combined-ca-bundle\") pod \"82caa489-b266-4d63-b903-b88755343da6\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.144578 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-scripts\") pod \"82caa489-b266-4d63-b903-b88755343da6\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.144627 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-sg-core-conf-yaml\") pod \"82caa489-b266-4d63-b903-b88755343da6\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.144684 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-log-httpd\") pod \"82caa489-b266-4d63-b903-b88755343da6\" (UID: \"82caa489-b266-4d63-b903-b88755343da6\") " Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.145228 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "82caa489-b266-4d63-b903-b88755343da6" (UID: "82caa489-b266-4d63-b903-b88755343da6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.145433 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "82caa489-b266-4d63-b903-b88755343da6" (UID: "82caa489-b266-4d63-b903-b88755343da6"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.145797 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.146095 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/82caa489-b266-4d63-b903-b88755343da6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.149285 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82caa489-b266-4d63-b903-b88755343da6-kube-api-access-4svz9" (OuterVolumeSpecName: "kube-api-access-4svz9") pod "82caa489-b266-4d63-b903-b88755343da6" (UID: "82caa489-b266-4d63-b903-b88755343da6"). InnerVolumeSpecName "kube-api-access-4svz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.152296 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-scripts" (OuterVolumeSpecName: "scripts") pod "82caa489-b266-4d63-b903-b88755343da6" (UID: "82caa489-b266-4d63-b903-b88755343da6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.185245 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "82caa489-b266-4d63-b903-b88755343da6" (UID: "82caa489-b266-4d63-b903-b88755343da6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.243012 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82caa489-b266-4d63-b903-b88755343da6" (UID: "82caa489-b266-4d63-b903-b88755343da6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.246236 4781 generic.go:334] "Generic (PLEG): container finished" podID="82caa489-b266-4d63-b903-b88755343da6" containerID="de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd" exitCode=0 Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.246262 4781 generic.go:334] "Generic (PLEG): container finished" podID="82caa489-b266-4d63-b903-b88755343da6" containerID="fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da" exitCode=0 Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.246319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerDied","Data":"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd"} Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.246353 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerDied","Data":"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da"} Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.246367 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"82caa489-b266-4d63-b903-b88755343da6","Type":"ContainerDied","Data":"6e95ab9cc6205ba990990920881bd0ebff0e38b2dffa0b6a18e80ba66feda775"} Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.246391 4781 scope.go:117] 
"RemoveContainer" containerID="a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.246573 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.251342 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4svz9\" (UniqueName: \"kubernetes.io/projected/82caa489-b266-4d63-b903-b88755343da6-kube-api-access-4svz9\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.251366 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.251376 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.251385 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.253391 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" event={"ID":"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed","Type":"ContainerStarted","Data":"924274b0475ca96e689285170a966a56819134a5e52e7295bc9da44021f4f826"} Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.256125 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-config-data" (OuterVolumeSpecName: "config-data") pod "82caa489-b266-4d63-b903-b88755343da6" (UID: 
"82caa489-b266-4d63-b903-b88755343da6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.276435 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" podStartSLOduration=1.433468549 podStartE2EDuration="11.276413937s" podCreationTimestamp="2025-12-08 20:25:00 +0000 UTC" firstStartedPulling="2025-12-08 20:25:00.848157281 +0000 UTC m=+1216.999440658" lastFinishedPulling="2025-12-08 20:25:10.691102669 +0000 UTC m=+1226.842386046" observedRunningTime="2025-12-08 20:25:11.272788443 +0000 UTC m=+1227.424071960" watchObservedRunningTime="2025-12-08 20:25:11.276413937 +0000 UTC m=+1227.427697314" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.282642 4781 scope.go:117] "RemoveContainer" containerID="8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.318178 4781 scope.go:117] "RemoveContainer" containerID="de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.339676 4781 scope.go:117] "RemoveContainer" containerID="fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.352683 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82caa489-b266-4d63-b903-b88755343da6-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.362760 4781 scope.go:117] "RemoveContainer" containerID="a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.363243 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf\": 
container with ID starting with a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf not found: ID does not exist" containerID="a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.363282 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf"} err="failed to get container status \"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf\": rpc error: code = NotFound desc = could not find container \"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf\": container with ID starting with a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf not found: ID does not exist" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.363304 4781 scope.go:117] "RemoveContainer" containerID="8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.363956 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e\": container with ID starting with 8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e not found: ID does not exist" containerID="8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.363984 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e"} err="failed to get container status \"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e\": rpc error: code = NotFound desc = could not find container \"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e\": container with ID starting with 
8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e not found: ID does not exist" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.364221 4781 scope.go:117] "RemoveContainer" containerID="de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.364736 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd\": container with ID starting with de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd not found: ID does not exist" containerID="de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.364788 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd"} err="failed to get container status \"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd\": rpc error: code = NotFound desc = could not find container \"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd\": container with ID starting with de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd not found: ID does not exist" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.364809 4781 scope.go:117] "RemoveContainer" containerID="fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.365659 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da\": container with ID starting with fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da not found: ID does not exist" containerID="fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da" Dec 08 20:25:11 crc 
kubenswrapper[4781]: I1208 20:25:11.365718 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da"} err="failed to get container status \"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da\": rpc error: code = NotFound desc = could not find container \"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da\": container with ID starting with fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da not found: ID does not exist" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.365753 4781 scope.go:117] "RemoveContainer" containerID="a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.366237 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf"} err="failed to get container status \"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf\": rpc error: code = NotFound desc = could not find container \"a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf\": container with ID starting with a349168a93e300056576bcc46522de5f66f89daef5ea765f69524c998d0d03bf not found: ID does not exist" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.366270 4781 scope.go:117] "RemoveContainer" containerID="8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.366536 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e"} err="failed to get container status \"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e\": rpc error: code = NotFound desc = could not find container \"8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e\": container 
with ID starting with 8a10499c79f0188f3d7715774165a7440e0881b42ce5a300303e4a94b1a20d6e not found: ID does not exist" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.366559 4781 scope.go:117] "RemoveContainer" containerID="de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.366906 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd"} err="failed to get container status \"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd\": rpc error: code = NotFound desc = could not find container \"de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd\": container with ID starting with de5e4baf831c1c9a874b9017f7fd0ec0439d09106a94a03866778e38ded86dfd not found: ID does not exist" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.366944 4781 scope.go:117] "RemoveContainer" containerID="fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.367497 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da"} err="failed to get container status \"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da\": rpc error: code = NotFound desc = could not find container \"fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da\": container with ID starting with fce6602270b00ff26e30d6a7903dc27424df11cf00b7a42272d4e09d90c616da not found: ID does not exist" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.581705 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.592682 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:11 crc kubenswrapper[4781]: 
I1208 20:25:11.604556 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.605013 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="proxy-httpd" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605034 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="proxy-httpd" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.605056 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="ceilometer-notification-agent" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605065 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="ceilometer-notification-agent" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.605081 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605091 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.605105 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="sg-core" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605112 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="sg-core" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.605129 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon-log" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605137 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon-log" Dec 08 20:25:11 crc kubenswrapper[4781]: E1208 20:25:11.605157 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="ceilometer-central-agent" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605167 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="ceilometer-central-agent" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605357 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605375 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="sg-core" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605395 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="ceilometer-notification-agent" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605409 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="proxy-httpd" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605431 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" containerName="horizon-log" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.605444 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="82caa489-b266-4d63-b903-b88755343da6" containerName="ceilometer-central-agent" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.608899 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.612446 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.612883 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.615087 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.758866 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.758913 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-log-httpd\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.758966 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-scripts\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.758989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsbqc\" (UniqueName: \"kubernetes.io/projected/8b9f797f-62e5-46e8-af49-6e5b9229d315-kube-api-access-jsbqc\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " 
pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.759072 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.759181 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-run-httpd\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.759245 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-config-data\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.809027 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.809599 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-log" containerID="cri-o://429298e82ab154c050eff815298b372a178b8e6a29f6a4b2ae8880707dd35c3f" gracePeriod=30 Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.809653 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-httpd" 
containerID="cri-o://a4465a3460b996b53f4540724ce4aa04411dca5181d9568b333dadcf9166e0af" gracePeriod=30 Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.860470 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-scripts\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.860875 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsbqc\" (UniqueName: \"kubernetes.io/projected/8b9f797f-62e5-46e8-af49-6e5b9229d315-kube-api-access-jsbqc\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.861048 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.861148 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-run-httpd\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.861220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-config-data\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.861282 4781 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.861328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-log-httpd\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.861918 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-run-httpd\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.862043 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-log-httpd\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.864935 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.866188 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-config-data\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 
20:25:11.866632 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-scripts\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.867127 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.877744 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsbqc\" (UniqueName: \"kubernetes.io/projected/8b9f797f-62e5-46e8-af49-6e5b9229d315-kube-api-access-jsbqc\") pod \"ceilometer-0\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " pod="openstack/ceilometer-0" Dec 08 20:25:11 crc kubenswrapper[4781]: I1208 20:25:11.928775 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:25:12 crc kubenswrapper[4781]: I1208 20:25:12.142839 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b714ffd-7c31-434d-a833-04abe6c8dcfb" path="/var/lib/kubelet/pods/2b714ffd-7c31-434d-a833-04abe6c8dcfb/volumes" Dec 08 20:25:12 crc kubenswrapper[4781]: I1208 20:25:12.143823 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82caa489-b266-4d63-b903-b88755343da6" path="/var/lib/kubelet/pods/82caa489-b266-4d63-b903-b88755343da6/volumes" Dec 08 20:25:12 crc kubenswrapper[4781]: I1208 20:25:12.263313 4781 generic.go:334] "Generic (PLEG): container finished" podID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerID="429298e82ab154c050eff815298b372a178b8e6a29f6a4b2ae8880707dd35c3f" exitCode=143 Dec 08 20:25:12 crc kubenswrapper[4781]: I1208 20:25:12.263393 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373","Type":"ContainerDied","Data":"429298e82ab154c050eff815298b372a178b8e6a29f6a4b2ae8880707dd35c3f"} Dec 08 20:25:12 crc kubenswrapper[4781]: I1208 20:25:12.354366 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:12 crc kubenswrapper[4781]: W1208 20:25:12.359871 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b9f797f_62e5_46e8_af49_6e5b9229d315.slice/crio-a9941dbba774a2691fc1e0967747da8280c01817afd7c1d0089e17faefd5fa5e WatchSource:0}: Error finding container a9941dbba774a2691fc1e0967747da8280c01817afd7c1d0089e17faefd5fa5e: Status 404 returned error can't find the container with id a9941dbba774a2691fc1e0967747da8280c01817afd7c1d0089e17faefd5fa5e Dec 08 20:25:12 crc kubenswrapper[4781]: I1208 20:25:12.561677 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:25:12 crc 
kubenswrapper[4781]: I1208 20:25:12.561977 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerName="glance-log" containerID="cri-o://2e145e1de2dd3ba34b7aa77b1adeb4e8a2412fa4b01e564867cea4cd041a30ef" gracePeriod=30 Dec 08 20:25:12 crc kubenswrapper[4781]: I1208 20:25:12.562092 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerName="glance-httpd" containerID="cri-o://1899ec1b6723e30dbfcfaa08dde9195c7ec8ebade14dc87e0f6c6f4be7dbaa10" gracePeriod=30 Dec 08 20:25:13 crc kubenswrapper[4781]: I1208 20:25:13.278947 4781 generic.go:334] "Generic (PLEG): container finished" podID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerID="2e145e1de2dd3ba34b7aa77b1adeb4e8a2412fa4b01e564867cea4cd041a30ef" exitCode=143 Dec 08 20:25:13 crc kubenswrapper[4781]: I1208 20:25:13.279100 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bc4529-0766-4d9b-ad5c-5b78604ebb10","Type":"ContainerDied","Data":"2e145e1de2dd3ba34b7aa77b1adeb4e8a2412fa4b01e564867cea4cd041a30ef"} Dec 08 20:25:13 crc kubenswrapper[4781]: I1208 20:25:13.281048 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerStarted","Data":"6724de4214475f0662c149db1be0c5c8742590f7a933a9083989811ced9b8f90"} Dec 08 20:25:13 crc kubenswrapper[4781]: I1208 20:25:13.281086 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerStarted","Data":"a9941dbba774a2691fc1e0967747da8280c01817afd7c1d0089e17faefd5fa5e"} Dec 08 20:25:14 crc kubenswrapper[4781]: I1208 20:25:14.291513 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerStarted","Data":"16c03849d5810ba98bfe2469ac1331189215795fb2cf3836301d2254756dd72a"} Dec 08 20:25:14 crc kubenswrapper[4781]: I1208 20:25:14.543224 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.308712 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerStarted","Data":"328abc95cf915fbba0b04658c64cfbdcca35856cda056306eef7054bfb77e542"} Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.315713 4781 generic.go:334] "Generic (PLEG): container finished" podID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerID="a4465a3460b996b53f4540724ce4aa04411dca5181d9568b333dadcf9166e0af" exitCode=0 Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.315771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373","Type":"ContainerDied","Data":"a4465a3460b996b53f4540724ce4aa04411dca5181d9568b333dadcf9166e0af"} Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.623366 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.668276 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-logs\") pod \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.668350 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xss8\" (UniqueName: \"kubernetes.io/projected/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-kube-api-access-2xss8\") pod \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.668396 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-config-data\") pod \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.668425 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-public-tls-certs\") pod \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.668451 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.668472 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-scripts\") pod \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.668508 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-combined-ca-bundle\") pod \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.668527 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-httpd-run\") pod \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\" (UID: \"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373\") " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.669769 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" (UID: "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.671327 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-logs" (OuterVolumeSpecName: "logs") pod "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" (UID: "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.685314 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" (UID: "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.693118 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-scripts" (OuterVolumeSpecName: "scripts") pod "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" (UID: "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.711900 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-kube-api-access-2xss8" (OuterVolumeSpecName: "kube-api-access-2xss8") pod "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" (UID: "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373"). InnerVolumeSpecName "kube-api-access-2xss8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.739149 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" (UID: "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.740007 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" (UID: "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.759807 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-config-data" (OuterVolumeSpecName: "config-data") pod "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" (UID: "1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.777357 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.777396 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.777429 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.777442 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:15 crc kubenswrapper[4781]: 
I1208 20:25:15.777453 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.777463 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.777473 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.777483 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xss8\" (UniqueName: \"kubernetes.io/projected/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373-kube-api-access-2xss8\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.802477 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 08 20:25:15 crc kubenswrapper[4781]: I1208 20:25:15.879004 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.330316 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerStarted","Data":"b1f860cbe891192ad7c4be33c20eef9a04f978b2ff48655cfe6b2b018c0b9afd"} Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.330693 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 20:25:16 crc kubenswrapper[4781]: 
I1208 20:25:16.330681 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="ceilometer-central-agent" containerID="cri-o://6724de4214475f0662c149db1be0c5c8742590f7a933a9083989811ced9b8f90" gracePeriod=30 Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.330762 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="sg-core" containerID="cri-o://328abc95cf915fbba0b04658c64cfbdcca35856cda056306eef7054bfb77e542" gracePeriod=30 Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.330818 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="ceilometer-notification-agent" containerID="cri-o://16c03849d5810ba98bfe2469ac1331189215795fb2cf3836301d2254756dd72a" gracePeriod=30 Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.330837 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="proxy-httpd" containerID="cri-o://b1f860cbe891192ad7c4be33c20eef9a04f978b2ff48655cfe6b2b018c0b9afd" gracePeriod=30 Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.340547 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373","Type":"ContainerDied","Data":"4e904a519fd92f06fab30d2bfc6966cc1aebb3a5a46d6d1d70ab5dcaa1700315"} Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.340612 4781 scope.go:117] "RemoveContainer" containerID="a4465a3460b996b53f4540724ce4aa04411dca5181d9568b333dadcf9166e0af" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.340606 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.357681 4781 generic.go:334] "Generic (PLEG): container finished" podID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerID="1899ec1b6723e30dbfcfaa08dde9195c7ec8ebade14dc87e0f6c6f4be7dbaa10" exitCode=0 Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.357731 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bc4529-0766-4d9b-ad5c-5b78604ebb10","Type":"ContainerDied","Data":"1899ec1b6723e30dbfcfaa08dde9195c7ec8ebade14dc87e0f6c6f4be7dbaa10"} Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.368220 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.006053515 podStartE2EDuration="5.368183139s" podCreationTimestamp="2025-12-08 20:25:11 +0000 UTC" firstStartedPulling="2025-12-08 20:25:12.361829954 +0000 UTC m=+1228.513113331" lastFinishedPulling="2025-12-08 20:25:15.723959578 +0000 UTC m=+1231.875242955" observedRunningTime="2025-12-08 20:25:16.359011855 +0000 UTC m=+1232.510295232" watchObservedRunningTime="2025-12-08 20:25:16.368183139 +0000 UTC m=+1232.519466516" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.388946 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.397637 4781 scope.go:117] "RemoveContainer" containerID="429298e82ab154c050eff815298b372a178b8e6a29f6a4b2ae8880707dd35c3f" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.405911 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.414638 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:25:16 crc kubenswrapper[4781]: E1208 20:25:16.415080 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-httpd" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.415099 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-httpd" Dec 08 20:25:16 crc kubenswrapper[4781]: E1208 20:25:16.415143 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-log" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.415150 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-log" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.415376 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-httpd" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.415403 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-log" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.416367 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.424355 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.424476 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.431492 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.507668 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxsr\" (UniqueName: \"kubernetes.io/projected/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-kube-api-access-tpxsr\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.507739 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.507797 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.507821 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.507856 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.507959 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-scripts\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.507990 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-logs\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.508024 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.618956 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-scripts\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.619328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-logs\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.619357 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.619406 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxsr\" (UniqueName: \"kubernetes.io/projected/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-kube-api-access-tpxsr\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.619441 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.619477 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.619496 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.619525 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.620962 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.621245 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-logs\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.621521 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: 
\"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.628278 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-scripts\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.631772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.634883 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.655820 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.664474 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxsr\" (UniqueName: \"kubernetes.io/projected/8238f3d8-03ef-4f26-a327-0f9e931aa7a6-kube-api-access-tpxsr\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " 
pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.713684 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"8238f3d8-03ef-4f26-a327-0f9e931aa7a6\") " pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.775613 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.796350 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.828992 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-combined-ca-bundle\") pod \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.829067 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-logs\") pod \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.829104 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-internal-tls-certs\") pod \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.829138 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-config-data\") pod \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.829185 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.829271 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-scripts\") pod \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.829312 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhdhz\" (UniqueName: \"kubernetes.io/projected/34bc4529-0766-4d9b-ad5c-5b78604ebb10-kube-api-access-lhdhz\") pod \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.829370 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-httpd-run\") pod \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\" (UID: \"34bc4529-0766-4d9b-ad5c-5b78604ebb10\") " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.830116 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "34bc4529-0766-4d9b-ad5c-5b78604ebb10" (UID: "34bc4529-0766-4d9b-ad5c-5b78604ebb10"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.832244 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-logs" (OuterVolumeSpecName: "logs") pod "34bc4529-0766-4d9b-ad5c-5b78604ebb10" (UID: "34bc4529-0766-4d9b-ad5c-5b78604ebb10"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.839967 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "34bc4529-0766-4d9b-ad5c-5b78604ebb10" (UID: "34bc4529-0766-4d9b-ad5c-5b78604ebb10"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.844390 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34bc4529-0766-4d9b-ad5c-5b78604ebb10-kube-api-access-lhdhz" (OuterVolumeSpecName: "kube-api-access-lhdhz") pod "34bc4529-0766-4d9b-ad5c-5b78604ebb10" (UID: "34bc4529-0766-4d9b-ad5c-5b78604ebb10"). InnerVolumeSpecName "kube-api-access-lhdhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.860828 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-scripts" (OuterVolumeSpecName: "scripts") pod "34bc4529-0766-4d9b-ad5c-5b78604ebb10" (UID: "34bc4529-0766-4d9b-ad5c-5b78604ebb10"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.913048 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34bc4529-0766-4d9b-ad5c-5b78604ebb10" (UID: "34bc4529-0766-4d9b-ad5c-5b78604ebb10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.931299 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.931353 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.931363 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.931373 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhdhz\" (UniqueName: \"kubernetes.io/projected/34bc4529-0766-4d9b-ad5c-5b78604ebb10-kube-api-access-lhdhz\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.931385 4781 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34bc4529-0766-4d9b-ad5c-5b78604ebb10-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.931393 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.945539 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-config-data" (OuterVolumeSpecName: "config-data") pod "34bc4529-0766-4d9b-ad5c-5b78604ebb10" (UID: "34bc4529-0766-4d9b-ad5c-5b78604ebb10"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.962790 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "34bc4529-0766-4d9b-ad5c-5b78604ebb10" (UID: "34bc4529-0766-4d9b-ad5c-5b78604ebb10"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:16 crc kubenswrapper[4781]: I1208 20:25:16.992749 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.040030 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.040065 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bc4529-0766-4d9b-ad5c-5b78604ebb10-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.040076 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 
08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.369429 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34bc4529-0766-4d9b-ad5c-5b78604ebb10","Type":"ContainerDied","Data":"486ae9c6afe76de2d63c49388f13fa87f85bc61367d5d5d6f9ffa370ccb98026"} Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.369746 4781 scope.go:117] "RemoveContainer" containerID="1899ec1b6723e30dbfcfaa08dde9195c7ec8ebade14dc87e0f6c6f4be7dbaa10" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.369756 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.385092 4781 generic.go:334] "Generic (PLEG): container finished" podID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerID="b1f860cbe891192ad7c4be33c20eef9a04f978b2ff48655cfe6b2b018c0b9afd" exitCode=0 Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.385135 4781 generic.go:334] "Generic (PLEG): container finished" podID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerID="328abc95cf915fbba0b04658c64cfbdcca35856cda056306eef7054bfb77e542" exitCode=2 Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.385148 4781 generic.go:334] "Generic (PLEG): container finished" podID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerID="16c03849d5810ba98bfe2469ac1331189215795fb2cf3836301d2254756dd72a" exitCode=0 Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.385204 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerDied","Data":"b1f860cbe891192ad7c4be33c20eef9a04f978b2ff48655cfe6b2b018c0b9afd"} Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.385235 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerDied","Data":"328abc95cf915fbba0b04658c64cfbdcca35856cda056306eef7054bfb77e542"} Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.385249 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerDied","Data":"16c03849d5810ba98bfe2469ac1331189215795fb2cf3836301d2254756dd72a"} Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.419421 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.421170 4781 scope.go:117] "RemoveContainer" containerID="2e145e1de2dd3ba34b7aa77b1adeb4e8a2412fa4b01e564867cea4cd041a30ef" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.434272 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.447020 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:25:17 crc kubenswrapper[4781]: E1208 20:25:17.447499 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerName="glance-log" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.447522 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerName="glance-log" Dec 08 20:25:17 crc kubenswrapper[4781]: E1208 20:25:17.447567 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerName="glance-httpd" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.447575 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerName="glance-httpd" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.447787 4781 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerName="glance-httpd" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.447818 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" containerName="glance-log" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.449061 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.451024 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.451210 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.460475 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:25:17 crc kubenswrapper[4781]: W1208 20:25:17.505309 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8238f3d8_03ef_4f26_a327_0f9e931aa7a6.slice/crio-daf2c8b0e515c49bf53cafc08a67d02f078ef4afc9661aac2cac61f2b9c94c21 WatchSource:0}: Error finding container daf2c8b0e515c49bf53cafc08a67d02f078ef4afc9661aac2cac61f2b9c94c21: Status 404 returned error can't find the container with id daf2c8b0e515c49bf53cafc08a67d02f078ef4afc9661aac2cac61f2b9c94c21 Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.512893 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.554995 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.555068 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.555114 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901250f8-f46c-49d0-83d3-3b0c7affea54-logs\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.555146 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.555168 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/901250f8-f46c-49d0-83d3-3b0c7affea54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.555194 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-f5hlw\" (UniqueName: \"kubernetes.io/projected/901250f8-f46c-49d0-83d3-3b0c7affea54-kube-api-access-f5hlw\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.555400 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.555606 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.657638 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.658356 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.664294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.665066 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.665385 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901250f8-f46c-49d0-83d3-3b0c7affea54-logs\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.665546 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.665560 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/901250f8-f46c-49d0-83d3-3b0c7affea54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.665739 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5hlw\" (UniqueName: 
\"kubernetes.io/projected/901250f8-f46c-49d0-83d3-3b0c7affea54-kube-api-access-f5hlw\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.666403 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.666908 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/901250f8-f46c-49d0-83d3-3b0c7affea54-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.669187 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901250f8-f46c-49d0-83d3-3b0c7affea54-logs\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.670512 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.670734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.673293 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-scripts\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.678399 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901250f8-f46c-49d0-83d3-3b0c7affea54-config-data\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.690286 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5hlw\" (UniqueName: \"kubernetes.io/projected/901250f8-f46c-49d0-83d3-3b0c7affea54-kube-api-access-f5hlw\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.706615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"901250f8-f46c-49d0-83d3-3b0c7affea54\") " pod="openstack/glance-default-internal-api-0" Dec 08 20:25:17 crc kubenswrapper[4781]: I1208 20:25:17.773271 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:18 crc kubenswrapper[4781]: I1208 20:25:18.141576 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" path="/var/lib/kubelet/pods/1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373/volumes" Dec 08 20:25:18 crc kubenswrapper[4781]: I1208 20:25:18.143723 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34bc4529-0766-4d9b-ad5c-5b78604ebb10" path="/var/lib/kubelet/pods/34bc4529-0766-4d9b-ad5c-5b78604ebb10/volumes" Dec 08 20:25:18 crc kubenswrapper[4781]: I1208 20:25:18.331492 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 08 20:25:18 crc kubenswrapper[4781]: W1208 20:25:18.338177 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod901250f8_f46c_49d0_83d3_3b0c7affea54.slice/crio-c80001b946db76ced929e6dbc440a3db2068afb14083a891f9ac0d9f89c8580a WatchSource:0}: Error finding container c80001b946db76ced929e6dbc440a3db2068afb14083a891f9ac0d9f89c8580a: Status 404 returned error can't find the container with id c80001b946db76ced929e6dbc440a3db2068afb14083a891f9ac0d9f89c8580a Dec 08 20:25:18 crc kubenswrapper[4781]: I1208 20:25:18.409250 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"901250f8-f46c-49d0-83d3-3b0c7affea54","Type":"ContainerStarted","Data":"c80001b946db76ced929e6dbc440a3db2068afb14083a891f9ac0d9f89c8580a"} Dec 08 20:25:18 crc kubenswrapper[4781]: I1208 20:25:18.413124 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8238f3d8-03ef-4f26-a327-0f9e931aa7a6","Type":"ContainerStarted","Data":"b2a535842475c5aa892d03e20d12a60806572980ead64991febcddfa6d90c894"} Dec 08 20:25:18 crc kubenswrapper[4781]: I1208 20:25:18.413495 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8238f3d8-03ef-4f26-a327-0f9e931aa7a6","Type":"ContainerStarted","Data":"daf2c8b0e515c49bf53cafc08a67d02f078ef4afc9661aac2cac61f2b9c94c21"} Dec 08 20:25:19 crc kubenswrapper[4781]: I1208 20:25:19.434139 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8238f3d8-03ef-4f26-a327-0f9e931aa7a6","Type":"ContainerStarted","Data":"cf810a70638ffc25dbcbcf85d198e1e339af6405cdebe42b925baf37c2049ad1"} Dec 08 20:25:19 crc kubenswrapper[4781]: I1208 20:25:19.437653 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"901250f8-f46c-49d0-83d3-3b0c7affea54","Type":"ContainerStarted","Data":"2f169bbdeabf791b4fa2f3c8c282d3411dae1b75f192d28e5addecfb175f07d2"} Dec 08 20:25:19 crc kubenswrapper[4781]: I1208 20:25:19.464941 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.464895817 podStartE2EDuration="3.464895817s" podCreationTimestamp="2025-12-08 20:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:19.454492568 +0000 UTC m=+1235.605775955" watchObservedRunningTime="2025-12-08 20:25:19.464895817 +0000 UTC m=+1235.616179194" Dec 08 20:25:20 crc kubenswrapper[4781]: I1208 20:25:20.448241 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"901250f8-f46c-49d0-83d3-3b0c7affea54","Type":"ContainerStarted","Data":"7076f2355755c36c0ff4bc68e95b1d7c0423fb2690412635f692c360f9a6f4b5"} Dec 08 20:25:20 crc kubenswrapper[4781]: I1208 20:25:20.477230 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.477203814 podStartE2EDuration="3.477203814s" 
podCreationTimestamp="2025-12-08 20:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:20.469017539 +0000 UTC m=+1236.620300926" watchObservedRunningTime="2025-12-08 20:25:20.477203814 +0000 UTC m=+1236.628487211" Dec 08 20:25:22 crc kubenswrapper[4781]: I1208 20:25:22.466516 4781 generic.go:334] "Generic (PLEG): container finished" podID="38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" containerID="924274b0475ca96e689285170a966a56819134a5e52e7295bc9da44021f4f826" exitCode=0 Dec 08 20:25:22 crc kubenswrapper[4781]: I1208 20:25:22.466654 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" event={"ID":"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed","Type":"ContainerDied","Data":"924274b0475ca96e689285170a966a56819134a5e52e7295bc9da44021f4f826"} Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.784621 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.896016 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-scripts\") pod \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.896384 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-config-data\") pod \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.896551 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wcx9\" (UniqueName: \"kubernetes.io/projected/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-kube-api-access-6wcx9\") pod \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.896576 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-combined-ca-bundle\") pod \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\" (UID: \"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed\") " Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.904072 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-scripts" (OuterVolumeSpecName: "scripts") pod "38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" (UID: "38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.904114 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-kube-api-access-6wcx9" (OuterVolumeSpecName: "kube-api-access-6wcx9") pod "38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" (UID: "38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed"). InnerVolumeSpecName "kube-api-access-6wcx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.928176 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" (UID: "38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.928588 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-config-data" (OuterVolumeSpecName: "config-data") pod "38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" (UID: "38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.998299 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wcx9\" (UniqueName: \"kubernetes.io/projected/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-kube-api-access-6wcx9\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.998330 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.998342 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:23 crc kubenswrapper[4781]: I1208 20:25:23.998353 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.486804 4781 generic.go:334] "Generic (PLEG): container finished" podID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerID="6724de4214475f0662c149db1be0c5c8742590f7a933a9083989811ced9b8f90" exitCode=0 Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.487246 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerDied","Data":"6724de4214475f0662c149db1be0c5c8742590f7a933a9083989811ced9b8f90"} Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.488999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" event={"ID":"38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed","Type":"ContainerDied","Data":"422f1b34bd189d3cdc81779501e69f9126939e8dbdfa1b04bdfb6bf3f1d7faca"} Dec 08 
20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.489066 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422f1b34bd189d3cdc81779501e69f9126939e8dbdfa1b04bdfb6bf3f1d7faca" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.489143 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zrfdd" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.557372 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583075 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 08 20:25:24 crc kubenswrapper[4781]: E1208 20:25:24.583491 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="proxy-httpd" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583507 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="proxy-httpd" Dec 08 20:25:24 crc kubenswrapper[4781]: E1208 20:25:24.583523 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="ceilometer-notification-agent" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583531 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="ceilometer-notification-agent" Dec 08 20:25:24 crc kubenswrapper[4781]: E1208 20:25:24.583542 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="ceilometer-central-agent" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583549 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="ceilometer-central-agent" Dec 08 20:25:24 crc 
kubenswrapper[4781]: E1208 20:25:24.583561 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="sg-core" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583568 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="sg-core" Dec 08 20:25:24 crc kubenswrapper[4781]: E1208 20:25:24.583592 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" containerName="nova-cell0-conductor-db-sync" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583600 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" containerName="nova-cell0-conductor-db-sync" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583779 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="ceilometer-central-agent" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583794 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="proxy-httpd" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583802 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="ceilometer-notification-agent" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583812 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" containerName="nova-cell0-conductor-db-sync" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.583824 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" containerName="sg-core" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.584430 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.589714 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dnkn2" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.590049 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.593600 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.624507 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-scripts\") pod \"8b9f797f-62e5-46e8-af49-6e5b9229d315\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.624592 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-sg-core-conf-yaml\") pod \"8b9f797f-62e5-46e8-af49-6e5b9229d315\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.624684 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsbqc\" (UniqueName: \"kubernetes.io/projected/8b9f797f-62e5-46e8-af49-6e5b9229d315-kube-api-access-jsbqc\") pod \"8b9f797f-62e5-46e8-af49-6e5b9229d315\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.624727 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-config-data\") pod \"8b9f797f-62e5-46e8-af49-6e5b9229d315\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " Dec 08 20:25:24 crc 
kubenswrapper[4781]: I1208 20:25:24.624760 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-log-httpd\") pod \"8b9f797f-62e5-46e8-af49-6e5b9229d315\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.624778 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-run-httpd\") pod \"8b9f797f-62e5-46e8-af49-6e5b9229d315\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.624838 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-combined-ca-bundle\") pod \"8b9f797f-62e5-46e8-af49-6e5b9229d315\" (UID: \"8b9f797f-62e5-46e8-af49-6e5b9229d315\") " Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.625479 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvzq\" (UniqueName: \"kubernetes.io/projected/5f31e53b-8234-4f29-bfcf-d3c037103945-kube-api-access-qfvzq\") pod \"nova-cell0-conductor-0\" (UID: \"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.625517 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f31e53b-8234-4f29-bfcf-d3c037103945-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.625578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f31e53b-8234-4f29-bfcf-d3c037103945-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.625504 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8b9f797f-62e5-46e8-af49-6e5b9229d315" (UID: "8b9f797f-62e5-46e8-af49-6e5b9229d315"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.625855 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8b9f797f-62e5-46e8-af49-6e5b9229d315" (UID: "8b9f797f-62e5-46e8-af49-6e5b9229d315"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.635073 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9f797f-62e5-46e8-af49-6e5b9229d315-kube-api-access-jsbqc" (OuterVolumeSpecName: "kube-api-access-jsbqc") pod "8b9f797f-62e5-46e8-af49-6e5b9229d315" (UID: "8b9f797f-62e5-46e8-af49-6e5b9229d315"). InnerVolumeSpecName "kube-api-access-jsbqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.635228 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-scripts" (OuterVolumeSpecName: "scripts") pod "8b9f797f-62e5-46e8-af49-6e5b9229d315" (UID: "8b9f797f-62e5-46e8-af49-6e5b9229d315"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.660268 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8b9f797f-62e5-46e8-af49-6e5b9229d315" (UID: "8b9f797f-62e5-46e8-af49-6e5b9229d315"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.713084 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b9f797f-62e5-46e8-af49-6e5b9229d315" (UID: "8b9f797f-62e5-46e8-af49-6e5b9229d315"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.727460 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvzq\" (UniqueName: \"kubernetes.io/projected/5f31e53b-8234-4f29-bfcf-d3c037103945-kube-api-access-qfvzq\") pod \"nova-cell0-conductor-0\" (UID: \"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.727513 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f31e53b-8234-4f29-bfcf-d3c037103945-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.727624 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f31e53b-8234-4f29-bfcf-d3c037103945-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.729268 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsbqc\" (UniqueName: \"kubernetes.io/projected/8b9f797f-62e5-46e8-af49-6e5b9229d315-kube-api-access-jsbqc\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.729839 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.729875 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b9f797f-62e5-46e8-af49-6e5b9229d315-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.729887 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.729898 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.729908 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.730992 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f31e53b-8234-4f29-bfcf-d3c037103945-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.738870 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-config-data" (OuterVolumeSpecName: "config-data") pod "8b9f797f-62e5-46e8-af49-6e5b9229d315" (UID: "8b9f797f-62e5-46e8-af49-6e5b9229d315"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.740330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f31e53b-8234-4f29-bfcf-d3c037103945-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.742649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvzq\" (UniqueName: \"kubernetes.io/projected/5f31e53b-8234-4f29-bfcf-d3c037103945-kube-api-access-qfvzq\") pod \"nova-cell0-conductor-0\" (UID: \"5f31e53b-8234-4f29-bfcf-d3c037103945\") " pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.831383 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b9f797f-62e5-46e8-af49-6e5b9229d315-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:24 crc kubenswrapper[4781]: I1208 20:25:24.947714 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.401971 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 08 20:25:25 crc kubenswrapper[4781]: W1208 20:25:25.408379 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f31e53b_8234_4f29_bfcf_d3c037103945.slice/crio-4f8fb92fe92f86050034777f9865c7e8a6efdd949365418731283ca9d1464c09 WatchSource:0}: Error finding container 4f8fb92fe92f86050034777f9865c7e8a6efdd949365418731283ca9d1464c09: Status 404 returned error can't find the container with id 4f8fb92fe92f86050034777f9865c7e8a6efdd949365418731283ca9d1464c09 Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.502607 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.502669 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8b9f797f-62e5-46e8-af49-6e5b9229d315","Type":"ContainerDied","Data":"a9941dbba774a2691fc1e0967747da8280c01817afd7c1d0089e17faefd5fa5e"} Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.502736 4781 scope.go:117] "RemoveContainer" containerID="b1f860cbe891192ad7c4be33c20eef9a04f978b2ff48655cfe6b2b018c0b9afd" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.505977 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5f31e53b-8234-4f29-bfcf-d3c037103945","Type":"ContainerStarted","Data":"4f8fb92fe92f86050034777f9865c7e8a6efdd949365418731283ca9d1464c09"} Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.522086 4781 scope.go:117] "RemoveContainer" containerID="328abc95cf915fbba0b04658c64cfbdcca35856cda056306eef7054bfb77e542" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.557132 4781 scope.go:117] "RemoveContainer" 
containerID="16c03849d5810ba98bfe2469ac1331189215795fb2cf3836301d2254756dd72a" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.559687 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.578845 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.588272 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.590632 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.593286 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.593470 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.594935 4781 scope.go:117] "RemoveContainer" containerID="6724de4214475f0662c149db1be0c5c8742590f7a933a9083989811ced9b8f90" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.600276 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.648874 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-log-httpd\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.648936 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.648969 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-config-data\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.649070 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-scripts\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.649301 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.649329 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-run-httpd\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.649424 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p5qx\" (UniqueName: \"kubernetes.io/projected/a0bee9e1-3aa3-44d2-891e-65dbb171270b-kube-api-access-9p5qx\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: 
I1208 20:25:25.751005 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.751062 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-run-httpd\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.751125 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p5qx\" (UniqueName: \"kubernetes.io/projected/a0bee9e1-3aa3-44d2-891e-65dbb171270b-kube-api-access-9p5qx\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.751194 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-log-httpd\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.751220 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.751269 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-config-data\") pod 
\"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.751293 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-scripts\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.753140 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-log-httpd\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.753286 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-run-httpd\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.756609 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-scripts\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.756784 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.757307 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.761068 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-config-data\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.769624 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p5qx\" (UniqueName: \"kubernetes.io/projected/a0bee9e1-3aa3-44d2-891e-65dbb171270b-kube-api-access-9p5qx\") pod \"ceilometer-0\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " pod="openstack/ceilometer-0" Dec 08 20:25:25 crc kubenswrapper[4781]: I1208 20:25:25.909203 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.144009 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9f797f-62e5-46e8-af49-6e5b9229d315" path="/var/lib/kubelet/pods/8b9f797f-62e5-46e8-af49-6e5b9229d315/volumes" Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.347654 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:25:26 crc kubenswrapper[4781]: W1208 20:25:26.356135 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-a47810d3fa8de74454b626e7fdbee857b61ec60498a9cb0d988969a4092d84b9 WatchSource:0}: Error finding container a47810d3fa8de74454b626e7fdbee857b61ec60498a9cb0d988969a4092d84b9: Status 404 returned error can't find the container with id a47810d3fa8de74454b626e7fdbee857b61ec60498a9cb0d988969a4092d84b9 Dec 08 
20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.525995 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5f31e53b-8234-4f29-bfcf-d3c037103945","Type":"ContainerStarted","Data":"f5bb0b78ccbf9464587c48f80538975b78167352977ad40f166741a6c15cbc47"} Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.527125 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.528726 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerStarted","Data":"a47810d3fa8de74454b626e7fdbee857b61ec60498a9cb0d988969a4092d84b9"} Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.547196 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5471793529999998 podStartE2EDuration="2.547179353s" podCreationTimestamp="2025-12-08 20:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:26.541324075 +0000 UTC m=+1242.692607482" watchObservedRunningTime="2025-12-08 20:25:26.547179353 +0000 UTC m=+1242.698462730" Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.797356 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.797695 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.834969 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 08 20:25:26 crc kubenswrapper[4781]: I1208 20:25:26.867276 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 08 20:25:27 crc kubenswrapper[4781]: I1208 20:25:27.538944 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerStarted","Data":"06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2"} Dec 08 20:25:27 crc kubenswrapper[4781]: I1208 20:25:27.540066 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 08 20:25:27 crc kubenswrapper[4781]: I1208 20:25:27.540084 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 08 20:25:27 crc kubenswrapper[4781]: I1208 20:25:27.774968 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:27 crc kubenswrapper[4781]: I1208 20:25:27.775033 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:27 crc kubenswrapper[4781]: I1208 20:25:27.805505 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:27 crc kubenswrapper[4781]: I1208 20:25:27.817326 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:28 crc kubenswrapper[4781]: I1208 20:25:28.548572 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerStarted","Data":"f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887"} Dec 08 20:25:28 crc kubenswrapper[4781]: I1208 20:25:28.548971 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:28 crc kubenswrapper[4781]: I1208 
20:25:28.548998 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:28 crc kubenswrapper[4781]: I1208 20:25:28.549019 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerStarted","Data":"78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa"} Dec 08 20:25:29 crc kubenswrapper[4781]: I1208 20:25:29.558197 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:25:29 crc kubenswrapper[4781]: I1208 20:25:29.559635 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:25:29 crc kubenswrapper[4781]: I1208 20:25:29.569762 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 08 20:25:29 crc kubenswrapper[4781]: I1208 20:25:29.619565 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 08 20:25:29 crc kubenswrapper[4781]: I1208 20:25:29.948306 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:25:29 crc kubenswrapper[4781]: I1208 20:25:29.948375 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:25:30 crc kubenswrapper[4781]: I1208 20:25:30.505546 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Dec 08 20:25:30 crc kubenswrapper[4781]: I1208 20:25:30.573078 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerStarted","Data":"1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4"} Dec 08 20:25:30 crc kubenswrapper[4781]: I1208 20:25:30.573172 4781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 08 20:25:30 crc kubenswrapper[4781]: I1208 20:25:30.573897 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 20:25:30 crc kubenswrapper[4781]: I1208 20:25:30.574869 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 08 20:25:30 crc kubenswrapper[4781]: I1208 20:25:30.601602 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.682540955 podStartE2EDuration="5.601580879s" podCreationTimestamp="2025-12-08 20:25:25 +0000 UTC" firstStartedPulling="2025-12-08 20:25:26.360122808 +0000 UTC m=+1242.511406185" lastFinishedPulling="2025-12-08 20:25:29.279162732 +0000 UTC m=+1245.430446109" observedRunningTime="2025-12-08 20:25:30.59744384 +0000 UTC m=+1246.748727217" watchObservedRunningTime="2025-12-08 20:25:30.601580879 +0000 UTC m=+1246.752864256" Dec 08 20:25:34 crc kubenswrapper[4781]: I1208 20:25:34.975159 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.519156 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pkxxp"] Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.520346 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.522058 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.522525 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.538992 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pkxxp"] Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.667262 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.667328 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-scripts\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.667449 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-config-data\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.667512 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhx4b\" (UniqueName: 
\"kubernetes.io/projected/81d1fdcf-4a08-4189-bea7-3e5399286272-kube-api-access-jhx4b\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.704512 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.706323 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.714697 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.717303 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.718709 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.724398 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.736358 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.770967 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-config-data\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.771049 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhx4b\" (UniqueName: \"kubernetes.io/projected/81d1fdcf-4a08-4189-bea7-3e5399286272-kube-api-access-jhx4b\") pod 
\"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.771111 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.771134 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-scripts\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.776886 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.788695 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.794698 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-config-data\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.799690 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-scripts\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.803788 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhx4b\" (UniqueName: \"kubernetes.io/projected/81d1fdcf-4a08-4189-bea7-3e5399286272-kube-api-access-jhx4b\") pod \"nova-cell0-cell-mapping-pkxxp\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.851760 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.859807 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.882590 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.889546 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-config-data\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.889632 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnfrp\" (UniqueName: \"kubernetes.io/projected/c978c9d3-9d38-4185-bc0a-e454c11412cb-kube-api-access-nnfrp\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.889655 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.889675 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978c9d3-9d38-4185-bc0a-e454c11412cb-logs\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.889749 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-config-data\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.889773 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.889838 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s2x\" (UniqueName: \"kubernetes.io/projected/0e30c528-158d-438a-8d03-a26e98107b90-kube-api-access-h8s2x\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.905439 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.918026 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.996757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-config-data\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.996801 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.996869 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s2x\" (UniqueName: 
\"kubernetes.io/projected/0e30c528-158d-438a-8d03-a26e98107b90-kube-api-access-h8s2x\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.996900 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.996965 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.997003 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-config-data\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.997037 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978c9d3-9d38-4185-bc0a-e454c11412cb-logs\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.997053 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnfrp\" (UniqueName: \"kubernetes.io/projected/c978c9d3-9d38-4185-bc0a-e454c11412cb-kube-api-access-nnfrp\") pod \"nova-api-0\" (UID: 
\"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.997072 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:35 crc kubenswrapper[4781]: I1208 20:25:35.997093 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r552\" (UniqueName: \"kubernetes.io/projected/bb4851cd-eb2c-49d7-927a-38b736ca36df-kube-api-access-4r552\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.028239 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.031474 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978c9d3-9d38-4185-bc0a-e454c11412cb-logs\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.050378 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-config-data\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0" Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.050876 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-config-data\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.081110 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.099587 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.099631 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.099685 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r552\" (UniqueName: \"kubernetes.io/projected/bb4851cd-eb2c-49d7-927a-38b736ca36df-kube-api-access-4r552\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.107450 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.108363 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.110605 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s2x\" (UniqueName: \"kubernetes.io/projected/0e30c528-158d-438a-8d03-a26e98107b90-kube-api-access-h8s2x\") pod \"nova-scheduler-0\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " pod="openstack/nova-scheduler-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.121846 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnfrp\" (UniqueName: \"kubernetes.io/projected/c978c9d3-9d38-4185-bc0a-e454c11412cb-kube-api-access-nnfrp\") pod \"nova-api-0\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " pod="openstack/nova-api-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.133613 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r552\" (UniqueName: \"kubernetes.io/projected/bb4851cd-eb2c-49d7-927a-38b736ca36df-kube-api-access-4r552\") pod \"nova-cell1-novncproxy-0\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.159828 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.161454 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.174338 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.177706 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.178361 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.258798 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-hf9gt"]
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.265362 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.305456 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-config-data\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.305614 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-logs\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.305656 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.305728 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwx49\" (UniqueName: \"kubernetes.io/projected/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-kube-api-access-rwx49\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.306559 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-hf9gt"]
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.347689 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.363795 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.407568 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwx49\" (UniqueName: \"kubernetes.io/projected/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-kube-api-access-rwx49\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.407648 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.407714 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.407806 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-config-data\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.407865 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.407892 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-config\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.413899 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbh5g\" (UniqueName: \"kubernetes.io/projected/89c523b0-35de-4752-88b4-3493cf0502b4-kube-api-access-bbh5g\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.413959 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.413999 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-logs\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.414041 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.415200 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-logs\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.418586 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-config-data\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.419587 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.425203 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwx49\" (UniqueName: \"kubernetes.io/projected/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-kube-api-access-rwx49\") pod \"nova-metadata-0\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.529081 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.529161 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.529266 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.529283 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-config\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.529314 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbh5g\" (UniqueName: \"kubernetes.io/projected/89c523b0-35de-4752-88b4-3493cf0502b4-kube-api-access-bbh5g\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.529333 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.530310 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.530821 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.531375 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.531895 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.552304 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.570082 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-config\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.574663 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbh5g\" (UniqueName: \"kubernetes.io/projected/89c523b0-35de-4752-88b4-3493cf0502b4-kube-api-access-bbh5g\") pod \"dnsmasq-dns-5c4475fdfc-hf9gt\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.676074 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:36 crc kubenswrapper[4781]: I1208 20:25:36.742794 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pkxxp"]
Dec 08 20:25:36 crc kubenswrapper[4781]: W1208 20:25:36.798612 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81d1fdcf_4a08_4189_bea7_3e5399286272.slice/crio-3cbb70bd8d9756ec48bd905e558b5a18d331f86739f0682ebb727239c3a858f2 WatchSource:0}: Error finding container 3cbb70bd8d9756ec48bd905e558b5a18d331f86739f0682ebb727239c3a858f2: Status 404 returned error can't find the container with id 3cbb70bd8d9756ec48bd905e558b5a18d331f86739f0682ebb727239c3a858f2
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.156739 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sqmzb"]
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.158641 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.162398 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.163026 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Dec 08 20:25:37 crc kubenswrapper[4781]: W1208 20:25:37.166684 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb4851cd_eb2c_49d7_927a_38b736ca36df.slice/crio-8439ae47cd14c26244e7064e2405fe5656569eae1f4578cf4f7695d214ed8049 WatchSource:0}: Error finding container 8439ae47cd14c26244e7064e2405fe5656569eae1f4578cf4f7695d214ed8049: Status 404 returned error can't find the container with id 8439ae47cd14c26244e7064e2405fe5656569eae1f4578cf4f7695d214ed8049
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.174029 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sqmzb"]
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.185377 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.292696 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-config-data\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.292782 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.292851 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-scripts\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.292875 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t27kz\" (UniqueName: \"kubernetes.io/projected/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-kube-api-access-t27kz\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.322037 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 08 20:25:37 crc kubenswrapper[4781]: W1208 20:25:37.351213 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode882e0ce_8e47_44af_9f43_b68f9cfdd2fa.slice/crio-bd8bfbc9dadc395a93b4e950808e01cbc7c28a77e88400bd0af101183d5a668d WatchSource:0}: Error finding container bd8bfbc9dadc395a93b4e950808e01cbc7c28a77e88400bd0af101183d5a668d: Status 404 returned error can't find the container with id bd8bfbc9dadc395a93b4e950808e01cbc7c28a77e88400bd0af101183d5a668d
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.394707 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-scripts\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.394747 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t27kz\" (UniqueName: \"kubernetes.io/projected/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-kube-api-access-t27kz\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.394880 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-config-data\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.394906 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.403693 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.403864 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-config-data\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.404739 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-scripts\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.412877 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t27kz\" (UniqueName: \"kubernetes.io/projected/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-kube-api-access-t27kz\") pod \"nova-cell1-conductor-db-sync-sqmzb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.514490 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sqmzb"
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.541342 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.551036 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.563393 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-hf9gt"]
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.717403 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c978c9d3-9d38-4185-bc0a-e454c11412cb","Type":"ContainerStarted","Data":"89016b90ade57f2479f57e042dd98b73ab21aa8a786e6368c07e0b466a83b73b"}
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.725846 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" event={"ID":"89c523b0-35de-4752-88b4-3493cf0502b4","Type":"ContainerStarted","Data":"9b5eb9c74ab697e3aa1a39439ccc11db54a293fbb0917d48cc2ed7695435c2bb"}
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.727417 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e30c528-158d-438a-8d03-a26e98107b90","Type":"ContainerStarted","Data":"3ff8c17b3ca31768b3b0d59e66bb488bad3a503df083d1c24d00f936359ab954"}
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.731354 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa","Type":"ContainerStarted","Data":"bd8bfbc9dadc395a93b4e950808e01cbc7c28a77e88400bd0af101183d5a668d"}
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.738291 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bb4851cd-eb2c-49d7-927a-38b736ca36df","Type":"ContainerStarted","Data":"8439ae47cd14c26244e7064e2405fe5656569eae1f4578cf4f7695d214ed8049"}
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.739534 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pkxxp" event={"ID":"81d1fdcf-4a08-4189-bea7-3e5399286272","Type":"ContainerStarted","Data":"355847e2b51871732e159f6ef626bfcacbe657c99bf6af0223d8dbcdaeaa060c"}
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.739563 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pkxxp" event={"ID":"81d1fdcf-4a08-4189-bea7-3e5399286272","Type":"ContainerStarted","Data":"3cbb70bd8d9756ec48bd905e558b5a18d331f86739f0682ebb727239c3a858f2"}
Dec 08 20:25:37 crc kubenswrapper[4781]: I1208 20:25:37.766647 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pkxxp" podStartSLOduration=2.766629032 podStartE2EDuration="2.766629032s" podCreationTimestamp="2025-12-08 20:25:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:37.759753335 +0000 UTC m=+1253.911036712" watchObservedRunningTime="2025-12-08 20:25:37.766629032 +0000 UTC m=+1253.917912409"
Dec 08 20:25:38 crc kubenswrapper[4781]: W1208 20:25:38.145165 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c1ebe9d_4e7a_44a3_84ae_f7f17ded01bb.slice/crio-72ec510065fc32850b85dce29fc5a073b0977d68208512f71224b132c15235f7 WatchSource:0}: Error finding container 72ec510065fc32850b85dce29fc5a073b0977d68208512f71224b132c15235f7: Status 404 returned error can't find the container with id 72ec510065fc32850b85dce29fc5a073b0977d68208512f71224b132c15235f7
Dec 08 20:25:38 crc kubenswrapper[4781]: I1208 20:25:38.151613 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sqmzb"]
Dec 08 20:25:38 crc kubenswrapper[4781]: I1208 20:25:38.748735 4781 generic.go:334] "Generic (PLEG): container finished" podID="89c523b0-35de-4752-88b4-3493cf0502b4" containerID="f634f19c9216173a4016c40deebba4c5235d55d6b0fe28cd68af8bcdefe99281" exitCode=0
Dec 08 20:25:38 crc kubenswrapper[4781]: I1208 20:25:38.748800 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" event={"ID":"89c523b0-35de-4752-88b4-3493cf0502b4","Type":"ContainerDied","Data":"f634f19c9216173a4016c40deebba4c5235d55d6b0fe28cd68af8bcdefe99281"}
Dec 08 20:25:38 crc kubenswrapper[4781]: I1208 20:25:38.752194 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sqmzb" event={"ID":"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb","Type":"ContainerStarted","Data":"72ec510065fc32850b85dce29fc5a073b0977d68208512f71224b132c15235f7"}
Dec 08 20:25:39 crc kubenswrapper[4781]: I1208 20:25:39.772846 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" event={"ID":"89c523b0-35de-4752-88b4-3493cf0502b4","Type":"ContainerStarted","Data":"5daf54665f713e6ff5780d763f9e11b552ceb9574cd092749fbe79136e8bf76d"}
Dec 08 20:25:39 crc kubenswrapper[4781]: I1208 20:25:39.773308 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt"
Dec 08 20:25:39 crc kubenswrapper[4781]: I1208 20:25:39.780749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sqmzb" event={"ID":"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb","Type":"ContainerStarted","Data":"8bfd56d687f86072d9c9e144cc9403520c08e92f6bd39dd28b0a4aadc0ccb874"}
Dec 08 20:25:39 crc kubenswrapper[4781]: I1208 20:25:39.803213 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" podStartSLOduration=3.80319595 podStartE2EDuration="3.80319595s" podCreationTimestamp="2025-12-08 20:25:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:39.79866046 +0000 UTC m=+1255.949943847" watchObservedRunningTime="2025-12-08 20:25:39.80319595 +0000 UTC m=+1255.954479327"
Dec 08 20:25:39 crc kubenswrapper[4781]: I1208 20:25:39.834801 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-sqmzb" podStartSLOduration=2.834778217 podStartE2EDuration="2.834778217s" podCreationTimestamp="2025-12-08 20:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:39.829157236 +0000 UTC m=+1255.980440623" watchObservedRunningTime="2025-12-08 20:25:39.834778217 +0000 UTC m=+1255.986061594"
Dec 08 20:25:39 crc kubenswrapper[4781]: I1208 20:25:39.854669 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 08 20:25:39 crc kubenswrapper[4781]: I1208 20:25:39.867738 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 08 20:25:41 crc kubenswrapper[4781]: I1208 20:25:41.800715 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e30c528-158d-438a-8d03-a26e98107b90","Type":"ContainerStarted","Data":"e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd"}
Dec 08 20:25:41 crc kubenswrapper[4781]: I1208 20:25:41.802976 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bb4851cd-eb2c-49d7-927a-38b736ca36df","Type":"ContainerStarted","Data":"7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12"}
Dec 08 20:25:41 crc kubenswrapper[4781]: I1208 20:25:41.803110 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="bb4851cd-eb2c-49d7-927a-38b736ca36df" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12" gracePeriod=30
Dec 08 20:25:41 crc kubenswrapper[4781]: I1208 20:25:41.806110 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa","Type":"ContainerStarted","Data":"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785"}
Dec 08 20:25:41 crc kubenswrapper[4781]: I1208 20:25:41.808423 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c978c9d3-9d38-4185-bc0a-e454c11412cb","Type":"ContainerStarted","Data":"6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868"}
Dec 08 20:25:41 crc kubenswrapper[4781]: I1208 20:25:41.828839 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.178385124 podStartE2EDuration="6.828820122s" podCreationTimestamp="2025-12-08 20:25:35 +0000 UTC" firstStartedPulling="2025-12-08 20:25:37.643061882 +0000 UTC m=+1253.794345259" lastFinishedPulling="2025-12-08 20:25:41.29349688 +0000 UTC m=+1257.444780257" observedRunningTime="2025-12-08 20:25:41.822607463 +0000 UTC m=+1257.973890840" watchObservedRunningTime="2025-12-08 20:25:41.828820122 +0000 UTC m=+1257.980103499"
Dec 08 20:25:41 crc kubenswrapper[4781]: I1208 20:25:41.852258 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.7285669390000002 podStartE2EDuration="6.852239965s" podCreationTimestamp="2025-12-08 20:25:35 +0000 UTC" firstStartedPulling="2025-12-08 20:25:37.168584169 +0000 UTC m=+1253.319867546" lastFinishedPulling="2025-12-08 20:25:41.292257195 +0000 UTC m=+1257.443540572" observedRunningTime="2025-12-08 20:25:41.843971297 +0000 UTC m=+1257.995254674" watchObservedRunningTime="2025-12-08 20:25:41.852239965 +0000 UTC m=+1258.003523342"
Dec 08 20:25:42 crc kubenswrapper[4781]: I1208 20:25:42.818384 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa","Type":"ContainerStarted","Data":"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714"}
Dec 08 20:25:42 crc kubenswrapper[4781]: I1208 20:25:42.818534 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerName="nova-metadata-metadata" containerID="cri-o://010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714" gracePeriod=30
Dec 08 20:25:42 crc kubenswrapper[4781]: I1208 20:25:42.818582 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerName="nova-metadata-log" containerID="cri-o://46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785" gracePeriod=30
Dec 08 20:25:42 crc kubenswrapper[4781]: I1208 20:25:42.823011 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c978c9d3-9d38-4185-bc0a-e454c11412cb","Type":"ContainerStarted","Data":"a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196"}
Dec 08 20:25:42 crc kubenswrapper[4781]: I1208 20:25:42.840342 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.876675158 podStartE2EDuration="7.840324596s" podCreationTimestamp="2025-12-08 20:25:35 +0000 UTC" firstStartedPulling="2025-12-08 20:25:37.3534422 +0000 UTC m=+1253.504725577" lastFinishedPulling="2025-12-08 20:25:41.317091638 +0000 UTC m=+1257.468375015" observedRunningTime="2025-12-08 20:25:42.835914979 +0000 UTC m=+1258.987198356" watchObservedRunningTime="2025-12-08 20:25:42.840324596 +0000 UTC m=+1258.991607993"
Dec 08 20:25:42 crc kubenswrapper[4781]: I1208 20:25:42.861222 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.223173504 podStartE2EDuration="7.861202376s" podCreationTimestamp="2025-12-08 20:25:35 +0000 UTC" firstStartedPulling="2025-12-08 20:25:37.653831361 +0000 UTC m=+1253.805114738" lastFinishedPulling="2025-12-08 20:25:41.291860233 +0000 UTC m=+1257.443143610" observedRunningTime="2025-12-08 20:25:42.854121292 +0000 UTC m=+1259.005404689" watchObservedRunningTime="2025-12-08 20:25:42.861202376 +0000 UTC m=+1259.012485753"
Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.396698 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.492934 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-combined-ca-bundle\") pod \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") "
Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.493013 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-logs\") pod \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") "
Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.493083 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwx49\" (UniqueName: \"kubernetes.io/projected/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-kube-api-access-rwx49\") pod \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") "
Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.493253 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-logs" (OuterVolumeSpecName: "logs") pod "e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" (UID: "e882e0ce-8e47-44af-9f43-b68f9cfdd2fa"). InnerVolumeSpecName "logs".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.493259 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-config-data\") pod \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\" (UID: \"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa\") " Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.493682 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.498680 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-kube-api-access-rwx49" (OuterVolumeSpecName: "kube-api-access-rwx49") pod "e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" (UID: "e882e0ce-8e47-44af-9f43-b68f9cfdd2fa"). InnerVolumeSpecName "kube-api-access-rwx49". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.521999 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-config-data" (OuterVolumeSpecName: "config-data") pod "e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" (UID: "e882e0ce-8e47-44af-9f43-b68f9cfdd2fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.527590 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" (UID: "e882e0ce-8e47-44af-9f43-b68f9cfdd2fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.596277 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.596685 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.596764 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwx49\" (UniqueName: \"kubernetes.io/projected/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa-kube-api-access-rwx49\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.836987 4781 generic.go:334] "Generic (PLEG): container finished" podID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerID="010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714" exitCode=0 Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.837019 4781 generic.go:334] "Generic (PLEG): container finished" podID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerID="46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785" exitCode=143 Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.837050 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.837099 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa","Type":"ContainerDied","Data":"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714"} Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.837134 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa","Type":"ContainerDied","Data":"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785"} Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.837146 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e882e0ce-8e47-44af-9f43-b68f9cfdd2fa","Type":"ContainerDied","Data":"bd8bfbc9dadc395a93b4e950808e01cbc7c28a77e88400bd0af101183d5a668d"} Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.837162 4781 scope.go:117] "RemoveContainer" containerID="010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.877649 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.877709 4781 scope.go:117] "RemoveContainer" containerID="46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.891291 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.906349 4781 scope.go:117] "RemoveContainer" containerID="010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714" Dec 08 20:25:43 crc kubenswrapper[4781]: E1208 20:25:43.906857 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714\": container with ID starting with 010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714 not found: ID does not exist" containerID="010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.906903 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714"} err="failed to get container status \"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714\": rpc error: code = NotFound desc = could not find container \"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714\": container with ID starting with 010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714 not found: ID does not exist" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.906959 4781 scope.go:117] "RemoveContainer" containerID="46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785" Dec 08 20:25:43 crc kubenswrapper[4781]: E1208 20:25:43.907273 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785\": container with ID starting with 46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785 not found: ID does not exist" containerID="46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.907305 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785"} err="failed to get container status \"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785\": rpc error: code = NotFound desc = could not find container \"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785\": container with ID 
starting with 46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785 not found: ID does not exist" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.907325 4781 scope.go:117] "RemoveContainer" containerID="010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.907653 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714"} err="failed to get container status \"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714\": rpc error: code = NotFound desc = could not find container \"010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714\": container with ID starting with 010f34e06b195bea0c2320e5aa9516a3f1af421efba2815361db147af8043714 not found: ID does not exist" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.907682 4781 scope.go:117] "RemoveContainer" containerID="46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.907747 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:43 crc kubenswrapper[4781]: E1208 20:25:43.908234 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerName="nova-metadata-log" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.908253 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerName="nova-metadata-log" Dec 08 20:25:43 crc kubenswrapper[4781]: E1208 20:25:43.908270 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerName="nova-metadata-metadata" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.908279 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" 
containerName="nova-metadata-metadata" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.908404 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785"} err="failed to get container status \"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785\": rpc error: code = NotFound desc = could not find container \"46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785\": container with ID starting with 46a8576f71bfcda17471178c7b44d1175ebcf97c1676f62683406953bc58a785 not found: ID does not exist" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.908511 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerName="nova-metadata-metadata" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.908539 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" containerName="nova-metadata-log" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.909989 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.913101 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.913292 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 08 20:25:43 crc kubenswrapper[4781]: I1208 20:25:43.927145 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.005430 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.005541 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c19d492-b7b4-407a-80ed-e13d181ca557-logs\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.005635 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.005661 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x67sb\" (UniqueName: \"kubernetes.io/projected/2c19d492-b7b4-407a-80ed-e13d181ca557-kube-api-access-x67sb\") pod 
\"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.005703 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-config-data\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.138576 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.138655 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c19d492-b7b4-407a-80ed-e13d181ca557-logs\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.138734 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.138759 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x67sb\" (UniqueName: \"kubernetes.io/projected/2c19d492-b7b4-407a-80ed-e13d181ca557-kube-api-access-x67sb\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 
20:25:44.138799 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-config-data\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.144816 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.146169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.150506 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c19d492-b7b4-407a-80ed-e13d181ca557-logs\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.172271 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-config-data\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.179039 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x67sb\" (UniqueName: \"kubernetes.io/projected/2c19d492-b7b4-407a-80ed-e13d181ca557-kube-api-access-x67sb\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.183274 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.190299 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e882e0ce-8e47-44af-9f43-b68f9cfdd2fa" path="/var/lib/kubelet/pods/e882e0ce-8e47-44af-9f43-b68f9cfdd2fa/volumes" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.237264 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.733511 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:44 crc kubenswrapper[4781]: I1208 20:25:44.851491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c19d492-b7b4-407a-80ed-e13d181ca557","Type":"ContainerStarted","Data":"af991339b5e6e268072b77a0c8e2e92c8d11ca93aba555511489f039c589f244"} Dec 08 20:25:45 crc kubenswrapper[4781]: I1208 20:25:45.527634 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": context deadline exceeded" Dec 08 20:25:45 crc kubenswrapper[4781]: I1208 20:25:45.527689 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="1c87fda8-8a5d-4ee5-8baa-eb5f52d0b373" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 08 20:25:45 crc kubenswrapper[4781]: I1208 20:25:45.867451 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c19d492-b7b4-407a-80ed-e13d181ca557","Type":"ContainerStarted","Data":"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b"} Dec 08 20:25:45 crc kubenswrapper[4781]: I1208 20:25:45.867505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c19d492-b7b4-407a-80ed-e13d181ca557","Type":"ContainerStarted","Data":"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490"} Dec 08 20:25:45 crc kubenswrapper[4781]: I1208 20:25:45.870605 4781 generic.go:334] "Generic (PLEG): container finished" podID="81d1fdcf-4a08-4189-bea7-3e5399286272" containerID="355847e2b51871732e159f6ef626bfcacbe657c99bf6af0223d8dbcdaeaa060c" exitCode=0 Dec 08 20:25:45 crc kubenswrapper[4781]: I1208 20:25:45.870653 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pkxxp" event={"ID":"81d1fdcf-4a08-4189-bea7-3e5399286272","Type":"ContainerDied","Data":"355847e2b51871732e159f6ef626bfcacbe657c99bf6af0223d8dbcdaeaa060c"} Dec 08 20:25:45 crc kubenswrapper[4781]: I1208 20:25:45.895441 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.8954194380000002 podStartE2EDuration="2.895419438s" podCreationTimestamp="2025-12-08 20:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:45.8916527 +0000 UTC m=+1262.042936077" watchObservedRunningTime="2025-12-08 20:25:45.895419438 +0000 UTC m=+1262.046702835" Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.178262 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.349116 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 
20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.349433 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.364293 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.364358 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.390508 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.678212 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.775346 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-dtltj"] Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.775912 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" podUID="20fec9e5-0d25-4c44-b756-b163add48592" containerName="dnsmasq-dns" containerID="cri-o://6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a" gracePeriod=10 Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.879713 4781 generic.go:334] "Generic (PLEG): container finished" podID="2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" containerID="8bfd56d687f86072d9c9e144cc9403520c08e92f6bd39dd28b0a4aadc0ccb874" exitCode=0 Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 20:25:46.879881 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sqmzb" event={"ID":"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb","Type":"ContainerDied","Data":"8bfd56d687f86072d9c9e144cc9403520c08e92f6bd39dd28b0a4aadc0ccb874"} Dec 08 20:25:46 crc kubenswrapper[4781]: I1208 
20:25:46.967386 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.312873 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.416312 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-combined-ca-bundle\") pod \"81d1fdcf-4a08-4189-bea7-3e5399286272\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.416379 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhx4b\" (UniqueName: \"kubernetes.io/projected/81d1fdcf-4a08-4189-bea7-3e5399286272-kube-api-access-jhx4b\") pod \"81d1fdcf-4a08-4189-bea7-3e5399286272\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.416483 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-config-data\") pod \"81d1fdcf-4a08-4189-bea7-3e5399286272\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.416546 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-scripts\") pod \"81d1fdcf-4a08-4189-bea7-3e5399286272\" (UID: \"81d1fdcf-4a08-4189-bea7-3e5399286272\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.423671 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d1fdcf-4a08-4189-bea7-3e5399286272-kube-api-access-jhx4b" (OuterVolumeSpecName: "kube-api-access-jhx4b") pod 
"81d1fdcf-4a08-4189-bea7-3e5399286272" (UID: "81d1fdcf-4a08-4189-bea7-3e5399286272"). InnerVolumeSpecName "kube-api-access-jhx4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.432151 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.433010 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-scripts" (OuterVolumeSpecName: "scripts") pod "81d1fdcf-4a08-4189-bea7-3e5399286272" (UID: "81d1fdcf-4a08-4189-bea7-3e5399286272"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.434145 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.463530 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-config-data" (OuterVolumeSpecName: "config-data") pod "81d1fdcf-4a08-4189-bea7-3e5399286272" (UID: "81d1fdcf-4a08-4189-bea7-3e5399286272"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.463562 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81d1fdcf-4a08-4189-bea7-3e5399286272" (UID: "81d1fdcf-4a08-4189-bea7-3e5399286272"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.487329 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.519402 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.519439 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhx4b\" (UniqueName: \"kubernetes.io/projected/81d1fdcf-4a08-4189-bea7-3e5399286272-kube-api-access-jhx4b\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.519506 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.519571 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81d1fdcf-4a08-4189-bea7-3e5399286272-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.620575 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-swift-storage-0\") pod \"20fec9e5-0d25-4c44-b756-b163add48592\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.620718 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-sb\") pod \"20fec9e5-0d25-4c44-b756-b163add48592\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.620779 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzrq6\" (UniqueName: \"kubernetes.io/projected/20fec9e5-0d25-4c44-b756-b163add48592-kube-api-access-dzrq6\") pod \"20fec9e5-0d25-4c44-b756-b163add48592\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.620806 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-nb\") pod \"20fec9e5-0d25-4c44-b756-b163add48592\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.620859 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-config\") pod \"20fec9e5-0d25-4c44-b756-b163add48592\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.620926 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-svc\") pod \"20fec9e5-0d25-4c44-b756-b163add48592\" (UID: \"20fec9e5-0d25-4c44-b756-b163add48592\") " Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.658372 
4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20fec9e5-0d25-4c44-b756-b163add48592-kube-api-access-dzrq6" (OuterVolumeSpecName: "kube-api-access-dzrq6") pod "20fec9e5-0d25-4c44-b756-b163add48592" (UID: "20fec9e5-0d25-4c44-b756-b163add48592"). InnerVolumeSpecName "kube-api-access-dzrq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.724087 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzrq6\" (UniqueName: \"kubernetes.io/projected/20fec9e5-0d25-4c44-b756-b163add48592-kube-api-access-dzrq6\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.749345 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "20fec9e5-0d25-4c44-b756-b163add48592" (UID: "20fec9e5-0d25-4c44-b756-b163add48592"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.754192 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "20fec9e5-0d25-4c44-b756-b163add48592" (UID: "20fec9e5-0d25-4c44-b756-b163add48592"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.756042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-config" (OuterVolumeSpecName: "config") pod "20fec9e5-0d25-4c44-b756-b163add48592" (UID: "20fec9e5-0d25-4c44-b756-b163add48592"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.760245 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "20fec9e5-0d25-4c44-b756-b163add48592" (UID: "20fec9e5-0d25-4c44-b756-b163add48592"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.783566 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "20fec9e5-0d25-4c44-b756-b163add48592" (UID: "20fec9e5-0d25-4c44-b756-b163add48592"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.825636 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.825664 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.825676 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.825686 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 
20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.825694 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20fec9e5-0d25-4c44-b756-b163add48592-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.889436 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pkxxp" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.890271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pkxxp" event={"ID":"81d1fdcf-4a08-4189-bea7-3e5399286272","Type":"ContainerDied","Data":"3cbb70bd8d9756ec48bd905e558b5a18d331f86739f0682ebb727239c3a858f2"} Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.890312 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cbb70bd8d9756ec48bd905e558b5a18d331f86739f0682ebb727239c3a858f2" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.891686 4781 generic.go:334] "Generic (PLEG): container finished" podID="20fec9e5-0d25-4c44-b756-b163add48592" containerID="6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a" exitCode=0 Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.891878 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.893050 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" event={"ID":"20fec9e5-0d25-4c44-b756-b163add48592","Type":"ContainerDied","Data":"6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a"} Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.893123 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-dtltj" event={"ID":"20fec9e5-0d25-4c44-b756-b163add48592","Type":"ContainerDied","Data":"6efe0e059ef03846acd9b979b140a228319bebfc010094c07203bc86ef5d307e"} Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.893144 4781 scope.go:117] "RemoveContainer" containerID="6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a" Dec 08 20:25:47 crc kubenswrapper[4781]: I1208 20:25:47.983028 4781 scope.go:117] "RemoveContainer" containerID="87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.031206 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-dtltj"] Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.070707 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-dtltj"] Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.089132 4781 scope.go:117] "RemoveContainer" containerID="6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a" Dec 08 20:25:48 crc kubenswrapper[4781]: E1208 20:25:48.089635 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a\": container with ID starting with 6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a not found: ID does not exist" 
containerID="6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.089675 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a"} err="failed to get container status \"6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a\": rpc error: code = NotFound desc = could not find container \"6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a\": container with ID starting with 6b0f9a288544f4a7fae5a2cafc45ed9c8903e1c6629e11c921c829605067917a not found: ID does not exist" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.089702 4781 scope.go:117] "RemoveContainer" containerID="87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793" Dec 08 20:25:48 crc kubenswrapper[4781]: E1208 20:25:48.089952 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793\": container with ID starting with 87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793 not found: ID does not exist" containerID="87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.089986 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793"} err="failed to get container status \"87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793\": rpc error: code = NotFound desc = could not find container \"87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793\": container with ID starting with 87ce7ddc2aae93096a51125a15c056f85ca7022dde8f3a864a82ebc8e08b9793 not found: ID does not exist" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.107535 4781 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.107785 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-log" containerID="cri-o://6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868" gracePeriod=30 Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.108246 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-api" containerID="cri-o://a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196" gracePeriod=30 Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.277865 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20fec9e5-0d25-4c44-b756-b163add48592" path="/var/lib/kubelet/pods/20fec9e5-0d25-4c44-b756-b163add48592/volumes" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.278574 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.278608 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.278803 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerName="nova-metadata-log" containerID="cri-o://b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490" gracePeriod=30 Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.279223 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerName="nova-metadata-metadata" containerID="cri-o://6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b" 
gracePeriod=30 Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.547764 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sqmzb" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.732794 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-config-data\") pod \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.733714 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t27kz\" (UniqueName: \"kubernetes.io/projected/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-kube-api-access-t27kz\") pod \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.733815 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-combined-ca-bundle\") pod \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.733848 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-scripts\") pod \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\" (UID: \"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.739561 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-kube-api-access-t27kz" (OuterVolumeSpecName: "kube-api-access-t27kz") pod "2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" (UID: "2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb"). 
InnerVolumeSpecName "kube-api-access-t27kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.740202 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-scripts" (OuterVolumeSpecName: "scripts") pod "2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" (UID: "2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.767125 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-config-data" (OuterVolumeSpecName: "config-data") pod "2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" (UID: "2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.788106 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" (UID: "2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.810148 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.837492 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.837538 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t27kz\" (UniqueName: \"kubernetes.io/projected/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-kube-api-access-t27kz\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.837570 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.837583 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.910314 4781 generic.go:334] "Generic (PLEG): container finished" podID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerID="6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b" exitCode=0 Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.910345 4781 generic.go:334] "Generic (PLEG): container finished" podID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerID="b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490" exitCode=143 Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.910385 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c19d492-b7b4-407a-80ed-e13d181ca557","Type":"ContainerDied","Data":"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b"} Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.910412 
4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c19d492-b7b4-407a-80ed-e13d181ca557","Type":"ContainerDied","Data":"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490"} Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.910422 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c19d492-b7b4-407a-80ed-e13d181ca557","Type":"ContainerDied","Data":"af991339b5e6e268072b77a0c8e2e92c8d11ca93aba555511489f039c589f244"} Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.910437 4781 scope.go:117] "RemoveContainer" containerID="6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.910528 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.919987 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sqmzb" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.920308 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sqmzb" event={"ID":"2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb","Type":"ContainerDied","Data":"72ec510065fc32850b85dce29fc5a073b0977d68208512f71224b132c15235f7"} Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.920336 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72ec510065fc32850b85dce29fc5a073b0977d68208512f71224b132c15235f7" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.922241 4781 generic.go:334] "Generic (PLEG): container finished" podID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerID="6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868" exitCode=143 Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.922306 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c978c9d3-9d38-4185-bc0a-e454c11412cb","Type":"ContainerDied","Data":"6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868"} Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.924179 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0e30c528-158d-438a-8d03-a26e98107b90" containerName="nova-scheduler-scheduler" containerID="cri-o://e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd" gracePeriod=30 Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.939498 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x67sb\" (UniqueName: \"kubernetes.io/projected/2c19d492-b7b4-407a-80ed-e13d181ca557-kube-api-access-x67sb\") pod \"2c19d492-b7b4-407a-80ed-e13d181ca557\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.939545 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-combined-ca-bundle\") pod \"2c19d492-b7b4-407a-80ed-e13d181ca557\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.939571 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-config-data\") pod \"2c19d492-b7b4-407a-80ed-e13d181ca557\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.939599 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-nova-metadata-tls-certs\") pod \"2c19d492-b7b4-407a-80ed-e13d181ca557\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.939709 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c19d492-b7b4-407a-80ed-e13d181ca557-logs\") pod \"2c19d492-b7b4-407a-80ed-e13d181ca557\" (UID: \"2c19d492-b7b4-407a-80ed-e13d181ca557\") " Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.940879 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c19d492-b7b4-407a-80ed-e13d181ca557-logs" (OuterVolumeSpecName: "logs") pod "2c19d492-b7b4-407a-80ed-e13d181ca557" (UID: "2c19d492-b7b4-407a-80ed-e13d181ca557"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.949506 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c19d492-b7b4-407a-80ed-e13d181ca557-kube-api-access-x67sb" (OuterVolumeSpecName: "kube-api-access-x67sb") pod "2c19d492-b7b4-407a-80ed-e13d181ca557" (UID: "2c19d492-b7b4-407a-80ed-e13d181ca557"). InnerVolumeSpecName "kube-api-access-x67sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.986604 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c19d492-b7b4-407a-80ed-e13d181ca557" (UID: "2c19d492-b7b4-407a-80ed-e13d181ca557"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998230 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 08 20:25:48 crc kubenswrapper[4781]: E1208 20:25:48.998669 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20fec9e5-0d25-4c44-b756-b163add48592" containerName="init" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998685 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20fec9e5-0d25-4c44-b756-b163add48592" containerName="init" Dec 08 20:25:48 crc kubenswrapper[4781]: E1208 20:25:48.998699 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d1fdcf-4a08-4189-bea7-3e5399286272" containerName="nova-manage" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998706 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d1fdcf-4a08-4189-bea7-3e5399286272" containerName="nova-manage" Dec 08 20:25:48 crc kubenswrapper[4781]: E1208 20:25:48.998724 4781 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerName="nova-metadata-log" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998730 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerName="nova-metadata-log" Dec 08 20:25:48 crc kubenswrapper[4781]: E1208 20:25:48.998744 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20fec9e5-0d25-4c44-b756-b163add48592" containerName="dnsmasq-dns" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998750 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="20fec9e5-0d25-4c44-b756-b163add48592" containerName="dnsmasq-dns" Dec 08 20:25:48 crc kubenswrapper[4781]: E1208 20:25:48.998757 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" containerName="nova-cell1-conductor-db-sync" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998763 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" containerName="nova-cell1-conductor-db-sync" Dec 08 20:25:48 crc kubenswrapper[4781]: E1208 20:25:48.998784 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerName="nova-metadata-metadata" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998790 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerName="nova-metadata-metadata" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998968 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerName="nova-metadata-log" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998991 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d1fdcf-4a08-4189-bea7-3e5399286272" containerName="nova-manage" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.998999 4781 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" containerName="nova-metadata-metadata" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.999009 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" containerName="nova-cell1-conductor-db-sync" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.999023 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="20fec9e5-0d25-4c44-b756-b163add48592" containerName="dnsmasq-dns" Dec 08 20:25:48 crc kubenswrapper[4781]: I1208 20:25:48.999614 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.006698 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.045389 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c19d492-b7b4-407a-80ed-e13d181ca557-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.045447 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x67sb\" (UniqueName: \"kubernetes.io/projected/2c19d492-b7b4-407a-80ed-e13d181ca557-kube-api-access-x67sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.045490 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.054817 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.062105 4781 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-config-data" (OuterVolumeSpecName: "config-data") pod "2c19d492-b7b4-407a-80ed-e13d181ca557" (UID: "2c19d492-b7b4-407a-80ed-e13d181ca557"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.087762 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2c19d492-b7b4-407a-80ed-e13d181ca557" (UID: "2c19d492-b7b4-407a-80ed-e13d181ca557"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.149029 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.149082 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnjpx\" (UniqueName: \"kubernetes.io/projected/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-kube-api-access-fnjpx\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.149104 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc 
kubenswrapper[4781]: I1208 20:25:49.149177 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.149190 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c19d492-b7b4-407a-80ed-e13d181ca557-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.181224 4781 scope.go:117] "RemoveContainer" containerID="b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.201068 4781 scope.go:117] "RemoveContainer" containerID="6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b" Dec 08 20:25:49 crc kubenswrapper[4781]: E1208 20:25:49.201680 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b\": container with ID starting with 6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b not found: ID does not exist" containerID="6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.201716 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b"} err="failed to get container status \"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b\": rpc error: code = NotFound desc = could not find container \"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b\": container with ID starting with 6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b not found: ID does not exist" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 
20:25:49.201740 4781 scope.go:117] "RemoveContainer" containerID="b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490" Dec 08 20:25:49 crc kubenswrapper[4781]: E1208 20:25:49.202293 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490\": container with ID starting with b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490 not found: ID does not exist" containerID="b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.202326 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490"} err="failed to get container status \"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490\": rpc error: code = NotFound desc = could not find container \"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490\": container with ID starting with b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490 not found: ID does not exist" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.202345 4781 scope.go:117] "RemoveContainer" containerID="6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.202634 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b"} err="failed to get container status \"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b\": rpc error: code = NotFound desc = could not find container \"6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b\": container with ID starting with 6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b not found: ID does not exist" Dec 08 20:25:49 crc 
kubenswrapper[4781]: I1208 20:25:49.202676 4781 scope.go:117] "RemoveContainer" containerID="b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.202987 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490"} err="failed to get container status \"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490\": rpc error: code = NotFound desc = could not find container \"b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490\": container with ID starting with b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490 not found: ID does not exist" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.250757 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.250824 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnjpx\" (UniqueName: \"kubernetes.io/projected/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-kube-api-access-fnjpx\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.250852 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.251194 4781 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.255309 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.261672 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.270885 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.279534 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnjpx\" (UniqueName: \"kubernetes.io/projected/af1b29f3-e2a8-4a48-9d18-8c502c3f435c-kube-api-access-fnjpx\") pod \"nova-cell1-conductor-0\" (UID: \"af1b29f3-e2a8-4a48-9d18-8c502c3f435c\") " pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.300071 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.301598 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.304093 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.304213 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.310390 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.453892 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.454063 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895bd81e-c873-46ce-99cd-526084be1061-logs\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.454158 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knh55\" (UniqueName: \"kubernetes.io/projected/895bd81e-c873-46ce-99cd-526084be1061-kube-api-access-knh55\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.454184 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.454397 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-config-data\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.491982 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.555968 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.556075 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-config-data\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.556170 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.556228 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895bd81e-c873-46ce-99cd-526084be1061-logs\") pod 
\"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.556293 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knh55\" (UniqueName: \"kubernetes.io/projected/895bd81e-c873-46ce-99cd-526084be1061-kube-api-access-knh55\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.557231 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895bd81e-c873-46ce-99cd-526084be1061-logs\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.560414 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.561055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-config-data\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.563322 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.578475 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-knh55\" (UniqueName: \"kubernetes.io/projected/895bd81e-c873-46ce-99cd-526084be1061-kube-api-access-knh55\") pod \"nova-metadata-0\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.628158 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:25:49 crc kubenswrapper[4781]: I1208 20:25:49.944306 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.069446 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.138379 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c19d492-b7b4-407a-80ed-e13d181ca557" path="/var/lib/kubelet/pods/2c19d492-b7b4-407a-80ed-e13d181ca557/volumes" Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.946103 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"895bd81e-c873-46ce-99cd-526084be1061","Type":"ContainerStarted","Data":"e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2"} Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.946145 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"895bd81e-c873-46ce-99cd-526084be1061","Type":"ContainerStarted","Data":"44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869"} Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.946159 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"895bd81e-c873-46ce-99cd-526084be1061","Type":"ContainerStarted","Data":"695c343718483ccfb2e522b88d307785555ce4278d10ddd8c6db24e339908229"} Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.947611 4781 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af1b29f3-e2a8-4a48-9d18-8c502c3f435c","Type":"ContainerStarted","Data":"57e09f27647f1274d8509e34b0d4d3b33a839928d24524a050e3f57200fd45f9"} Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.947660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"af1b29f3-e2a8-4a48-9d18-8c502c3f435c","Type":"ContainerStarted","Data":"50b859b3d411ab6c8c3f2e990df02cc5de6c4d56fc6f9c6aa6000bf8508c4707"} Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.948331 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:50 crc kubenswrapper[4781]: I1208 20:25:50.996969 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9969500409999998 podStartE2EDuration="1.996950041s" podCreationTimestamp="2025-12-08 20:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:50.971895051 +0000 UTC m=+1267.123178438" watchObservedRunningTime="2025-12-08 20:25:50.996950041 +0000 UTC m=+1267.148233418" Dec 08 20:25:51 crc kubenswrapper[4781]: E1208 20:25:51.368716 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 20:25:51 crc kubenswrapper[4781]: E1208 20:25:51.373407 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 20:25:51 crc kubenswrapper[4781]: E1208 20:25:51.375227 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 20:25:51 crc kubenswrapper[4781]: E1208 20:25:51.375428 4781 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0e30c528-158d-438a-8d03-a26e98107b90" containerName="nova-scheduler-scheduler" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.514386 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.547335 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=5.547312801 podStartE2EDuration="5.547312801s" podCreationTimestamp="2025-12-08 20:25:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:50.997569319 +0000 UTC m=+1267.148852696" watchObservedRunningTime="2025-12-08 20:25:53.547312801 +0000 UTC m=+1269.698596178" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.671330 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8s2x\" (UniqueName: \"kubernetes.io/projected/0e30c528-158d-438a-8d03-a26e98107b90-kube-api-access-h8s2x\") pod \"0e30c528-158d-438a-8d03-a26e98107b90\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.673512 
4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-combined-ca-bundle\") pod \"0e30c528-158d-438a-8d03-a26e98107b90\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.673666 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-config-data\") pod \"0e30c528-158d-438a-8d03-a26e98107b90\" (UID: \"0e30c528-158d-438a-8d03-a26e98107b90\") " Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.677690 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e30c528-158d-438a-8d03-a26e98107b90-kube-api-access-h8s2x" (OuterVolumeSpecName: "kube-api-access-h8s2x") pod "0e30c528-158d-438a-8d03-a26e98107b90" (UID: "0e30c528-158d-438a-8d03-a26e98107b90"). InnerVolumeSpecName "kube-api-access-h8s2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.709154 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-config-data" (OuterVolumeSpecName: "config-data") pod "0e30c528-158d-438a-8d03-a26e98107b90" (UID: "0e30c528-158d-438a-8d03-a26e98107b90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.723134 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e30c528-158d-438a-8d03-a26e98107b90" (UID: "0e30c528-158d-438a-8d03-a26e98107b90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.776134 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.776163 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e30c528-158d-438a-8d03-a26e98107b90-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.776173 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8s2x\" (UniqueName: \"kubernetes.io/projected/0e30c528-158d-438a-8d03-a26e98107b90-kube-api-access-h8s2x\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.969178 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.977686 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-combined-ca-bundle\") pod \"c978c9d3-9d38-4185-bc0a-e454c11412cb\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.977790 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnfrp\" (UniqueName: \"kubernetes.io/projected/c978c9d3-9d38-4185-bc0a-e454c11412cb-kube-api-access-nnfrp\") pod \"c978c9d3-9d38-4185-bc0a-e454c11412cb\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.977890 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c978c9d3-9d38-4185-bc0a-e454c11412cb-logs\") pod \"c978c9d3-9d38-4185-bc0a-e454c11412cb\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.977911 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-config-data\") pod \"c978c9d3-9d38-4185-bc0a-e454c11412cb\" (UID: \"c978c9d3-9d38-4185-bc0a-e454c11412cb\") " Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.978470 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c978c9d3-9d38-4185-bc0a-e454c11412cb-logs" (OuterVolumeSpecName: "logs") pod "c978c9d3-9d38-4185-bc0a-e454c11412cb" (UID: "c978c9d3-9d38-4185-bc0a-e454c11412cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.979098 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c978c9d3-9d38-4185-bc0a-e454c11412cb-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.981575 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c978c9d3-9d38-4185-bc0a-e454c11412cb-kube-api-access-nnfrp" (OuterVolumeSpecName: "kube-api-access-nnfrp") pod "c978c9d3-9d38-4185-bc0a-e454c11412cb" (UID: "c978c9d3-9d38-4185-bc0a-e454c11412cb"). InnerVolumeSpecName "kube-api-access-nnfrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.998681 4781 generic.go:334] "Generic (PLEG): container finished" podID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerID="a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196" exitCode=0 Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.998947 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c978c9d3-9d38-4185-bc0a-e454c11412cb","Type":"ContainerDied","Data":"a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196"} Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.999037 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c978c9d3-9d38-4185-bc0a-e454c11412cb","Type":"ContainerDied","Data":"89016b90ade57f2479f57e042dd98b73ab21aa8a786e6368c07e0b466a83b73b"} Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.999103 4781 scope.go:117] "RemoveContainer" containerID="a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196" Dec 08 20:25:53 crc kubenswrapper[4781]: I1208 20:25:53.999305 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.007449 4781 generic.go:334] "Generic (PLEG): container finished" podID="0e30c528-158d-438a-8d03-a26e98107b90" containerID="e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd" exitCode=0 Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.007495 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e30c528-158d-438a-8d03-a26e98107b90","Type":"ContainerDied","Data":"e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd"} Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.007527 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e30c528-158d-438a-8d03-a26e98107b90","Type":"ContainerDied","Data":"3ff8c17b3ca31768b3b0d59e66bb488bad3a503df083d1c24d00f936359ab954"} Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.007595 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.016064 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c978c9d3-9d38-4185-bc0a-e454c11412cb" (UID: "c978c9d3-9d38-4185-bc0a-e454c11412cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.020035 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-config-data" (OuterVolumeSpecName: "config-data") pod "c978c9d3-9d38-4185-bc0a-e454c11412cb" (UID: "c978c9d3-9d38-4185-bc0a-e454c11412cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.025439 4781 scope.go:117] "RemoveContainer" containerID="6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.050127 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.057514 4781 scope.go:117] "RemoveContainer" containerID="a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196" Dec 08 20:25:54 crc kubenswrapper[4781]: E1208 20:25:54.058531 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196\": container with ID starting with a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196 not found: ID does not exist" containerID="a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.058567 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196"} err="failed to get container status \"a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196\": rpc error: code = NotFound desc = could not find container \"a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196\": container with ID starting with a772c01eae0bd665e745f0a410760fd00cf88f76ecc71dcb54690d0809af4196 not found: ID does not exist" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.058587 4781 scope.go:117] "RemoveContainer" containerID="6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868" Dec 08 20:25:54 crc kubenswrapper[4781]: E1208 20:25:54.059774 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868\": container with ID starting with 6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868 not found: ID does not exist" containerID="6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.059806 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868"} err="failed to get container status \"6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868\": rpc error: code = NotFound desc = could not find container \"6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868\": container with ID starting with 6ea7c289c27d7253d426e59544f6cd40df7fb39aad538b48126cc0592ac81868 not found: ID does not exist" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.059854 4781 scope.go:117] "RemoveContainer" containerID="e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.062580 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.072570 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:25:54 crc kubenswrapper[4781]: E1208 20:25:54.073014 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-log" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.073041 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-log" Dec 08 20:25:54 crc kubenswrapper[4781]: E1208 20:25:54.073077 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-api" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 
20:25:54.073086 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-api" Dec 08 20:25:54 crc kubenswrapper[4781]: E1208 20:25:54.073111 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e30c528-158d-438a-8d03-a26e98107b90" containerName="nova-scheduler-scheduler" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.073119 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e30c528-158d-438a-8d03-a26e98107b90" containerName="nova-scheduler-scheduler" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.073333 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-api" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.073371 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e30c528-158d-438a-8d03-a26e98107b90" containerName="nova-scheduler-scheduler" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.073394 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" containerName="nova-api-log" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.074124 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.076683 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.081210 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.081259 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-config-data\") pod \"nova-scheduler-0\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.081366 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmtfg\" (UniqueName: \"kubernetes.io/projected/e3331e6e-3384-4cc5-af5a-c24583a6865b-kube-api-access-bmtfg\") pod \"nova-scheduler-0\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.081485 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.081497 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c978c9d3-9d38-4185-bc0a-e454c11412cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.081531 4781 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnfrp\" (UniqueName: \"kubernetes.io/projected/c978c9d3-9d38-4185-bc0a-e454c11412cb-kube-api-access-nnfrp\") on node \"crc\" DevicePath \"\"" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.084093 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.087984 4781 scope.go:117] "RemoveContainer" containerID="e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd" Dec 08 20:25:54 crc kubenswrapper[4781]: E1208 20:25:54.093227 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd\": container with ID starting with e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd not found: ID does not exist" containerID="e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.093278 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd"} err="failed to get container status \"e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd\": rpc error: code = NotFound desc = could not find container \"e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd\": container with ID starting with e6c6e4f70896451b7f25a1ebe9eaab999ae056aae9c6711c95eaa40ede96c5fd not found: ID does not exist" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.147293 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e30c528-158d-438a-8d03-a26e98107b90" path="/var/lib/kubelet/pods/0e30c528-158d-438a-8d03-a26e98107b90/volumes" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.183447 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.183543 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-config-data\") pod \"nova-scheduler-0\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.184195 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmtfg\" (UniqueName: \"kubernetes.io/projected/e3331e6e-3384-4cc5-af5a-c24583a6865b-kube-api-access-bmtfg\") pod \"nova-scheduler-0\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.187471 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.190704 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-config-data\") pod \"nova-scheduler-0\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.201226 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmtfg\" (UniqueName: \"kubernetes.io/projected/e3331e6e-3384-4cc5-af5a-c24583a6865b-kube-api-access-bmtfg\") pod \"nova-scheduler-0\" (UID: 
\"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.339323 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.352732 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.366101 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.385384 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.389206 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.392332 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.400591 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.501784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.501844 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-config-data\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.501895 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0735293f-2f40-4edc-9184-ee889e5784d8-logs\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.502383 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d67dr\" (UniqueName: \"kubernetes.io/projected/0735293f-2f40-4edc-9184-ee889e5784d8-kube-api-access-d67dr\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.604607 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-config-data\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.604981 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0735293f-2f40-4edc-9184-ee889e5784d8-logs\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.605024 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d67dr\" (UniqueName: \"kubernetes.io/projected/0735293f-2f40-4edc-9184-ee889e5784d8-kube-api-access-d67dr\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.605152 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.605753 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0735293f-2f40-4edc-9184-ee889e5784d8-logs\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.609906 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-config-data\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.612369 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc 
kubenswrapper[4781]: I1208 20:25:54.624345 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d67dr\" (UniqueName: \"kubernetes.io/projected/0735293f-2f40-4edc-9184-ee889e5784d8-kube-api-access-d67dr\") pod \"nova-api-0\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.629875 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.629947 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.706046 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:25:54 crc kubenswrapper[4781]: I1208 20:25:54.843313 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:25:55 crc kubenswrapper[4781]: I1208 20:25:55.020541 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3331e6e-3384-4cc5-af5a-c24583a6865b","Type":"ContainerStarted","Data":"9031ffd6ba7a2ee2365d0552f2c17e140f0aab1dde07f05a63e2ae5661475ea1"} Dec 08 20:25:55 crc kubenswrapper[4781]: I1208 20:25:55.134983 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:25:55 crc kubenswrapper[4781]: I1208 20:25:55.913740 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 08 20:25:56 crc kubenswrapper[4781]: I1208 20:25:56.035740 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3331e6e-3384-4cc5-af5a-c24583a6865b","Type":"ContainerStarted","Data":"7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8"} Dec 08 20:25:56 crc kubenswrapper[4781]: I1208 20:25:56.039182 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0735293f-2f40-4edc-9184-ee889e5784d8","Type":"ContainerStarted","Data":"bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297"} Dec 08 20:25:56 crc kubenswrapper[4781]: I1208 20:25:56.039240 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0735293f-2f40-4edc-9184-ee889e5784d8","Type":"ContainerStarted","Data":"4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933"} Dec 08 20:25:56 crc kubenswrapper[4781]: I1208 20:25:56.039255 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0735293f-2f40-4edc-9184-ee889e5784d8","Type":"ContainerStarted","Data":"012e770e6eb1bf3f113292f415c802fe5812383daad0aecb6032e0de18b3fbd0"} Dec 08 20:25:56 crc kubenswrapper[4781]: I1208 20:25:56.051959 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.051940806 podStartE2EDuration="2.051940806s" podCreationTimestamp="2025-12-08 20:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:56.05031916 +0000 UTC m=+1272.201602537" watchObservedRunningTime="2025-12-08 20:25:56.051940806 +0000 UTC m=+1272.203224173" Dec 08 20:25:56 crc kubenswrapper[4781]: I1208 20:25:56.079237 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.07922087 podStartE2EDuration="2.07922087s" podCreationTimestamp="2025-12-08 20:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:25:56.074159075 +0000 UTC m=+1272.225442452" watchObservedRunningTime="2025-12-08 20:25:56.07922087 +0000 UTC m=+1272.230504247" Dec 08 20:25:56 crc kubenswrapper[4781]: I1208 20:25:56.137636 4781 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="c978c9d3-9d38-4185-bc0a-e454c11412cb" path="/var/lib/kubelet/pods/c978c9d3-9d38-4185-bc0a-e454c11412cb/volumes" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.366229 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.366967 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2ca70357-2cfc-42dd-a9cf-2c5f992ba62d" containerName="kube-state-metrics" containerID="cri-o://66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98" gracePeriod=30 Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.401047 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.525666 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.629275 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.629323 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.901016 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.948315 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.948431 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.948508 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.949783 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4fea80c9d2853786513b0b8aceae77577c3f9f5cebb3cd832d508e012c04f4da"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:25:59 crc kubenswrapper[4781]: I1208 20:25:59.949859 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://4fea80c9d2853786513b0b8aceae77577c3f9f5cebb3cd832d508e012c04f4da" gracePeriod=600 Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.013465 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d87bz\" (UniqueName: \"kubernetes.io/projected/2ca70357-2cfc-42dd-a9cf-2c5f992ba62d-kube-api-access-d87bz\") pod \"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d\" (UID: \"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d\") " Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.025346 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca70357-2cfc-42dd-a9cf-2c5f992ba62d-kube-api-access-d87bz" (OuterVolumeSpecName: "kube-api-access-d87bz") pod "2ca70357-2cfc-42dd-a9cf-2c5f992ba62d" (UID: "2ca70357-2cfc-42dd-a9cf-2c5f992ba62d"). InnerVolumeSpecName "kube-api-access-d87bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.085700 4781 generic.go:334] "Generic (PLEG): container finished" podID="2ca70357-2cfc-42dd-a9cf-2c5f992ba62d" containerID="66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98" exitCode=2 Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.085741 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d","Type":"ContainerDied","Data":"66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98"} Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.085769 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ca70357-2cfc-42dd-a9cf-2c5f992ba62d","Type":"ContainerDied","Data":"624f207844f6986613ec2519fde6b56c2758d417b48bb2dc53d007a4d728f4a7"} Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.085787 4781 scope.go:117] "RemoveContainer" containerID="66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.085909 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.115783 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d87bz\" (UniqueName: \"kubernetes.io/projected/2ca70357-2cfc-42dd-a9cf-2c5f992ba62d-kube-api-access-d87bz\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.167094 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.182287 4781 scope.go:117] "RemoveContainer" containerID="66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.182657 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 20:26:00 crc kubenswrapper[4781]: E1208 20:26:00.183475 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98\": container with ID starting with 66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98 not found: ID does not exist" containerID="66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.183523 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98"} err="failed to get container status \"66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98\": rpc error: code = NotFound desc = could not find container \"66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98\": container with ID starting with 66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98 not found: ID does not exist" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.194826 4781 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 20:26:00 crc kubenswrapper[4781]: E1208 20:26:00.195267 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca70357-2cfc-42dd-a9cf-2c5f992ba62d" containerName="kube-state-metrics" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.195286 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca70357-2cfc-42dd-a9cf-2c5f992ba62d" containerName="kube-state-metrics" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.195499 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca70357-2cfc-42dd-a9cf-2c5f992ba62d" containerName="kube-state-metrics" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.196161 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.207376 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.207399 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.220018 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.220097 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb4b8\" (UniqueName: \"kubernetes.io/projected/acc0a735-094b-4857-8238-5240530c62dc-kube-api-access-hb4b8\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " 
pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.220120 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.220203 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.233087 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.321645 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.321792 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb4b8\" (UniqueName: \"kubernetes.io/projected/acc0a735-094b-4857-8238-5240530c62dc-kube-api-access-hb4b8\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.321837 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.322008 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.327215 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.327772 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.328424 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/acc0a735-094b-4857-8238-5240530c62dc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.342420 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb4b8\" (UniqueName: 
\"kubernetes.io/projected/acc0a735-094b-4857-8238-5240530c62dc-kube-api-access-hb4b8\") pod \"kube-state-metrics-0\" (UID: \"acc0a735-094b-4857-8238-5240530c62dc\") " pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.536708 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.638155 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 20:26:00 crc kubenswrapper[4781]: I1208 20:26:00.638154 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.073232 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.079864 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.099684 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="4fea80c9d2853786513b0b8aceae77577c3f9f5cebb3cd832d508e012c04f4da" exitCode=0 Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.099770 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" 
event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"4fea80c9d2853786513b0b8aceae77577c3f9f5cebb3cd832d508e012c04f4da"} Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.099801 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"a9c0ebca79495ed59d45aa666bc740d910c70a10325b1eaed4b6173c141374c9"} Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.099821 4781 scope.go:117] "RemoveContainer" containerID="25df9dbfafc8a5164a8f6020132a91b9381bc39dadc7e73659feabffa41e871a" Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.102736 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"acc0a735-094b-4857-8238-5240530c62dc","Type":"ContainerStarted","Data":"379c25429f0d2df6fee81cebcb531e0661e465afcc154096a6d0c3b8bd6584ea"} Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.368125 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.368447 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="ceilometer-central-agent" containerID="cri-o://06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2" gracePeriod=30 Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.368557 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="ceilometer-notification-agent" containerID="cri-o://78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa" gracePeriod=30 Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.368585 4781 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="sg-core" containerID="cri-o://f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887" gracePeriod=30 Dec 08 20:26:01 crc kubenswrapper[4781]: I1208 20:26:01.368538 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="proxy-httpd" containerID="cri-o://1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4" gracePeriod=30 Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.114478 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"acc0a735-094b-4857-8238-5240530c62dc","Type":"ContainerStarted","Data":"97aaea65b34e0f6ff3857356e99b794c950b35d129b27a16a46cc8e003fc12ce"} Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.114740 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.118948 4781 generic.go:334] "Generic (PLEG): container finished" podID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerID="1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4" exitCode=0 Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.118973 4781 generic.go:334] "Generic (PLEG): container finished" podID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerID="f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887" exitCode=2 Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.118986 4781 generic.go:334] "Generic (PLEG): container finished" podID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerID="06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2" exitCode=0 Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.118986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerDied","Data":"1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4"} Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.119018 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerDied","Data":"f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887"} Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.119029 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerDied","Data":"06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2"} Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.137721 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.795815026 podStartE2EDuration="2.137696749s" podCreationTimestamp="2025-12-08 20:26:00 +0000 UTC" firstStartedPulling="2025-12-08 20:26:01.079598397 +0000 UTC m=+1277.230881774" lastFinishedPulling="2025-12-08 20:26:01.42148012 +0000 UTC m=+1277.572763497" observedRunningTime="2025-12-08 20:26:02.136270368 +0000 UTC m=+1278.287553785" watchObservedRunningTime="2025-12-08 20:26:02.137696749 +0000 UTC m=+1278.288980146" Dec 08 20:26:02 crc kubenswrapper[4781]: I1208 20:26:02.144337 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca70357-2cfc-42dd-a9cf-2c5f992ba62d" path="/var/lib/kubelet/pods/2ca70357-2cfc-42dd-a9cf-2c5f992ba62d/volumes" Dec 08 20:26:04 crc kubenswrapper[4781]: I1208 20:26:04.401307 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 08 20:26:04 crc kubenswrapper[4781]: I1208 20:26:04.437803 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 08 20:26:04 crc kubenswrapper[4781]: I1208 
20:26:04.706589 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 20:26:04 crc kubenswrapper[4781]: I1208 20:26:04.706641 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 20:26:05 crc kubenswrapper[4781]: I1208 20:26:05.218782 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 08 20:26:05 crc kubenswrapper[4781]: I1208 20:26:05.788185 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 20:26:05 crc kubenswrapper[4781]: I1208 20:26:05.788239 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 08 20:26:09 crc kubenswrapper[4781]: I1208 20:26:09.636810 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 08 20:26:09 crc kubenswrapper[4781]: I1208 20:26:09.642967 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 08 20:26:09 crc kubenswrapper[4781]: I1208 20:26:09.645442 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 08 20:26:10 crc kubenswrapper[4781]: I1208 20:26:10.291678 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 08 20:26:10 crc kubenswrapper[4781]: I1208 20:26:10.546434 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/kube-state-metrics-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.146651 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.253638 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-run-httpd\") pod \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.254185 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-log-httpd\") pod \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.254247 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p5qx\" (UniqueName: \"kubernetes.io/projected/a0bee9e1-3aa3-44d2-891e-65dbb171270b-kube-api-access-9p5qx\") pod \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.254299 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-config-data\") pod \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.254347 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-scripts\") pod \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 
20:26:11.254377 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-sg-core-conf-yaml\") pod \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.254429 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-combined-ca-bundle\") pod \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\" (UID: \"a0bee9e1-3aa3-44d2-891e-65dbb171270b\") " Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.254502 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0bee9e1-3aa3-44d2-891e-65dbb171270b" (UID: "a0bee9e1-3aa3-44d2-891e-65dbb171270b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.254637 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0bee9e1-3aa3-44d2-891e-65dbb171270b" (UID: "a0bee9e1-3aa3-44d2-891e-65dbb171270b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.255159 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.255182 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0bee9e1-3aa3-44d2-891e-65dbb171270b-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.269194 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-scripts" (OuterVolumeSpecName: "scripts") pod "a0bee9e1-3aa3-44d2-891e-65dbb171270b" (UID: "a0bee9e1-3aa3-44d2-891e-65dbb171270b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.279357 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bee9e1-3aa3-44d2-891e-65dbb171270b-kube-api-access-9p5qx" (OuterVolumeSpecName: "kube-api-access-9p5qx") pod "a0bee9e1-3aa3-44d2-891e-65dbb171270b" (UID: "a0bee9e1-3aa3-44d2-891e-65dbb171270b"). InnerVolumeSpecName "kube-api-access-9p5qx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.333411 4781 generic.go:334] "Generic (PLEG): container finished" podID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerID="78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa" exitCode=0 Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.334454 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.335014 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerDied","Data":"78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa"} Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.335098 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0bee9e1-3aa3-44d2-891e-65dbb171270b","Type":"ContainerDied","Data":"a47810d3fa8de74454b626e7fdbee857b61ec60498a9cb0d988969a4092d84b9"} Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.335121 4781 scope.go:117] "RemoveContainer" containerID="1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.357214 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p5qx\" (UniqueName: \"kubernetes.io/projected/a0bee9e1-3aa3-44d2-891e-65dbb171270b-kube-api-access-9p5qx\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.357281 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.386087 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0bee9e1-3aa3-44d2-891e-65dbb171270b" (UID: "a0bee9e1-3aa3-44d2-891e-65dbb171270b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.433985 4781 scope.go:117] "RemoveContainer" containerID="f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.446220 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0bee9e1-3aa3-44d2-891e-65dbb171270b" (UID: "a0bee9e1-3aa3-44d2-891e-65dbb171270b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.458523 4781 scope.go:117] "RemoveContainer" containerID="78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.459806 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.459820 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.466954 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-config-data" (OuterVolumeSpecName: "config-data") pod "a0bee9e1-3aa3-44d2-891e-65dbb171270b" (UID: "a0bee9e1-3aa3-44d2-891e-65dbb171270b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.476621 4781 scope.go:117] "RemoveContainer" containerID="06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.496056 4781 scope.go:117] "RemoveContainer" containerID="1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4" Dec 08 20:26:11 crc kubenswrapper[4781]: E1208 20:26:11.496817 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4\": container with ID starting with 1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4 not found: ID does not exist" containerID="1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.496926 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4"} err="failed to get container status \"1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4\": rpc error: code = NotFound desc = could not find container \"1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4\": container with ID starting with 1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4 not found: ID does not exist" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.497079 4781 scope.go:117] "RemoveContainer" containerID="f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887" Dec 08 20:26:11 crc kubenswrapper[4781]: E1208 20:26:11.497572 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887\": container with ID starting with 
f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887 not found: ID does not exist" containerID="f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.497605 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887"} err="failed to get container status \"f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887\": rpc error: code = NotFound desc = could not find container \"f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887\": container with ID starting with f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887 not found: ID does not exist" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.497627 4781 scope.go:117] "RemoveContainer" containerID="78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa" Dec 08 20:26:11 crc kubenswrapper[4781]: E1208 20:26:11.497837 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa\": container with ID starting with 78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa not found: ID does not exist" containerID="78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.497854 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa"} err="failed to get container status \"78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa\": rpc error: code = NotFound desc = could not find container \"78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa\": container with ID starting with 78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa not found: ID does not 
exist" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.497865 4781 scope.go:117] "RemoveContainer" containerID="06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2" Dec 08 20:26:11 crc kubenswrapper[4781]: E1208 20:26:11.498255 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2\": container with ID starting with 06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2 not found: ID does not exist" containerID="06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.498274 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2"} err="failed to get container status \"06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2\": rpc error: code = NotFound desc = could not find container \"06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2\": container with ID starting with 06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2 not found: ID does not exist" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.561472 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0bee9e1-3aa3-44d2-891e-65dbb171270b-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.673057 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.680853 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.700202 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:26:11 crc kubenswrapper[4781]: E1208 
20:26:11.700665 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="ceilometer-notification-agent" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.700685 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="ceilometer-notification-agent" Dec 08 20:26:11 crc kubenswrapper[4781]: E1208 20:26:11.700696 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="sg-core" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.700708 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="sg-core" Dec 08 20:26:11 crc kubenswrapper[4781]: E1208 20:26:11.700719 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="proxy-httpd" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.700727 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="proxy-httpd" Dec 08 20:26:11 crc kubenswrapper[4781]: E1208 20:26:11.700769 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="ceilometer-central-agent" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.700777 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="ceilometer-central-agent" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.701018 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="ceilometer-notification-agent" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.701044 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="sg-core" Dec 08 20:26:11 crc 
kubenswrapper[4781]: I1208 20:26:11.701066 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="proxy-httpd" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.701087 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" containerName="ceilometer-central-agent" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.703288 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.706135 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.706146 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.708743 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.721314 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.764970 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-log-httpd\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.765012 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-config-data\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 
20:26:11.765161 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mfz2\" (UniqueName: \"kubernetes.io/projected/27cc9130-8fda-46c9-acbf-6f73e99ffb32-kube-api-access-8mfz2\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.765261 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.765324 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-run-httpd\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.765554 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.765642 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.765771 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-scripts\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: W1208 20:26:11.853093 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c19d492_b7b4_407a_80ed_e13d181ca557.slice/crio-af991339b5e6e268072b77a0c8e2e92c8d11ca93aba555511489f039c589f244 WatchSource:0}: Error finding container af991339b5e6e268072b77a0c8e2e92c8d11ca93aba555511489f039c589f244: Status 404 returned error can't find the container with id af991339b5e6e268072b77a0c8e2e92c8d11ca93aba555511489f039c589f244 Dec 08 20:26:11 crc kubenswrapper[4781]: W1208 20:26:11.853659 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c19d492_b7b4_407a_80ed_e13d181ca557.slice/crio-b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490.scope WatchSource:0}: Error finding container b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490: Status 404 returned error can't find the container with id b6e30b79dfad80898451d5a5df50a0f2581251c0a8d7c9ad457502a389de2490 Dec 08 20:26:11 crc kubenswrapper[4781]: W1208 20:26:11.854377 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c19d492_b7b4_407a_80ed_e13d181ca557.slice/crio-6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b.scope WatchSource:0}: Error finding container 6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b: Status 404 returned error can't find the container with id 6eba45256a620d529905e2ee47064743fa3688dedd51fbdd762aaa1611f2cc4b Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.887067 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.887179 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-run-httpd\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.887294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.887421 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.887487 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-scripts\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.887635 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-config-data\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc 
kubenswrapper[4781]: I1208 20:26:11.887660 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-log-httpd\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.887775 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mfz2\" (UniqueName: \"kubernetes.io/projected/27cc9130-8fda-46c9-acbf-6f73e99ffb32-kube-api-access-8mfz2\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.889003 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-run-httpd\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.890270 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-log-httpd\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.894087 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.895041 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-config-data\") pod \"ceilometer-0\" 
(UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.896023 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.898522 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-scripts\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.903009 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:11 crc kubenswrapper[4781]: I1208 20:26:11.909525 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mfz2\" (UniqueName: \"kubernetes.io/projected/27cc9130-8fda-46c9-acbf-6f73e99ffb32-kube-api-access-8mfz2\") pod \"ceilometer-0\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") " pod="openstack/ceilometer-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.020561 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:26:12 crc kubenswrapper[4781]: E1208 20:26:12.099434 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb4851cd_eb2c_49d7_927a_38b736ca36df.slice/crio-7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc978c9d3_9d38_4185_bc0a_e454c11412cb.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-conmon-f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ca70357_2cfc_42dd_a9cf_2c5f992ba62d.slice/crio-624f207844f6986613ec2519fde6b56c2758d417b48bb2dc53d007a4d728f4a7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ca70357_2cfc_42dd_a9cf_2c5f992ba62d.slice/crio-conmon-66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-conmon-1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-conmon-78e7cbb08551a87ec916ac0fef50445e6a8f08656e9c1969b33b10b438049daa.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-conmon-06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ca70357_2cfc_42dd_a9cf_2c5f992ba62d.slice/crio-66ddc596ea99d9117c67454522c2d82bf70e0a42e3779e0c79e4cc95f8167c98.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f7ef82f_3bfa_4ebf_a7f0_b2d00fbcc6a8.slice/crio-4fea80c9d2853786513b0b8aceae77577c3f9f5cebb3cd832d508e012c04f4da.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb4851cd_eb2c_49d7_927a_38b736ca36df.slice/crio-conmon-7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-a47810d3fa8de74454b626e7fdbee857b61ec60498a9cb0d988969a4092d84b9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f7ef82f_3bfa_4ebf_a7f0_b2d00fbcc6a8.slice/crio-conmon-4fea80c9d2853786513b0b8aceae77577c3f9f5cebb3cd832d508e012c04f4da.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-06d6a60d7cc072b61367f58a2f8de89ef54c6bfd0564e24cfdbf3c0576f9bce2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-1e51229fae5c7d89e2971e40f4db96dc4290429a6a4b4d4338eedad1644739d4.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ca70357_2cfc_42dd_a9cf_2c5f992ba62d.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice/crio-f2c46dba940f26c9195e37b95346aae97fe71bac6abc3cd9f6ff22b01447b887.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bee9e1_3aa3_44d2_891e_65dbb171270b.slice\": RecentStats: unable to find data in memory cache]" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.142771 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0bee9e1-3aa3-44d2-891e-65dbb171270b" path="/var/lib/kubelet/pods/a0bee9e1-3aa3-44d2-891e-65dbb171270b/volumes" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.166154 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.298610 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-config-data\") pod \"bb4851cd-eb2c-49d7-927a-38b736ca36df\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.298726 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r552\" (UniqueName: \"kubernetes.io/projected/bb4851cd-eb2c-49d7-927a-38b736ca36df-kube-api-access-4r552\") pod \"bb4851cd-eb2c-49d7-927a-38b736ca36df\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.298752 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-combined-ca-bundle\") pod 
\"bb4851cd-eb2c-49d7-927a-38b736ca36df\" (UID: \"bb4851cd-eb2c-49d7-927a-38b736ca36df\") " Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.304194 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4851cd-eb2c-49d7-927a-38b736ca36df-kube-api-access-4r552" (OuterVolumeSpecName: "kube-api-access-4r552") pod "bb4851cd-eb2c-49d7-927a-38b736ca36df" (UID: "bb4851cd-eb2c-49d7-927a-38b736ca36df"). InnerVolumeSpecName "kube-api-access-4r552". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.325767 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-config-data" (OuterVolumeSpecName: "config-data") pod "bb4851cd-eb2c-49d7-927a-38b736ca36df" (UID: "bb4851cd-eb2c-49d7-927a-38b736ca36df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.330005 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb4851cd-eb2c-49d7-927a-38b736ca36df" (UID: "bb4851cd-eb2c-49d7-927a-38b736ca36df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.345242 4781 generic.go:334] "Generic (PLEG): container finished" podID="bb4851cd-eb2c-49d7-927a-38b736ca36df" containerID="7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12" exitCode=137 Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.345304 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.345321 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bb4851cd-eb2c-49d7-927a-38b736ca36df","Type":"ContainerDied","Data":"7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12"} Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.345351 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"bb4851cd-eb2c-49d7-927a-38b736ca36df","Type":"ContainerDied","Data":"8439ae47cd14c26244e7064e2405fe5656569eae1f4578cf4f7695d214ed8049"} Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.345373 4781 scope.go:117] "RemoveContainer" containerID="7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.387547 4781 scope.go:117] "RemoveContainer" containerID="7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12" Dec 08 20:26:12 crc kubenswrapper[4781]: E1208 20:26:12.388273 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12\": container with ID starting with 7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12 not found: ID does not exist" containerID="7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.388337 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12"} err="failed to get container status \"7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12\": rpc error: code = NotFound desc = could not find container \"7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12\": container with ID starting with 
7f1417b9dad6a5e627d82e51d833371f13a6c16f015b1ef443c9e1b94b80cc12 not found: ID does not exist" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.389190 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.400519 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.400552 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r552\" (UniqueName: \"kubernetes.io/projected/bb4851cd-eb2c-49d7-927a-38b736ca36df-kube-api-access-4r552\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.400565 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4851cd-eb2c-49d7-927a-38b736ca36df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.400527 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.422540 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 20:26:12 crc kubenswrapper[4781]: E1208 20:26:12.423043 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4851cd-eb2c-49d7-927a-38b736ca36df" containerName="nova-cell1-novncproxy-novncproxy" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.423064 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4851cd-eb2c-49d7-927a-38b736ca36df" containerName="nova-cell1-novncproxy-novncproxy" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.423293 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4851cd-eb2c-49d7-927a-38b736ca36df" 
containerName="nova-cell1-novncproxy-novncproxy" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.423901 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.428305 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.428559 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.428305 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.431645 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 20:26:12 crc kubenswrapper[4781]: W1208 20:26:12.473497 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27cc9130_8fda_46c9_acbf_6f73e99ffb32.slice/crio-d0e0c1670ba0210d0539b5214a0b2f0b762f67b328a8712b11987897ce279630 WatchSource:0}: Error finding container d0e0c1670ba0210d0539b5214a0b2f0b762f67b328a8712b11987897ce279630: Status 404 returned error can't find the container with id d0e0c1670ba0210d0539b5214a0b2f0b762f67b328a8712b11987897ce279630 Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.473795 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.502087 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.502215 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv99n\" (UniqueName: \"kubernetes.io/projected/28fbba80-b3c1-45f6-ad0a-3435a48fd033-kube-api-access-nv99n\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.502280 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.502315 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.502391 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.603722 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.604145 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv99n\" (UniqueName: \"kubernetes.io/projected/28fbba80-b3c1-45f6-ad0a-3435a48fd033-kube-api-access-nv99n\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.604187 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.604234 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.604291 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.608583 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.608622 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.609232 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.609817 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/28fbba80-b3c1-45f6-ad0a-3435a48fd033-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.621612 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv99n\" (UniqueName: \"kubernetes.io/projected/28fbba80-b3c1-45f6-ad0a-3435a48fd033-kube-api-access-nv99n\") pod \"nova-cell1-novncproxy-0\" (UID: \"28fbba80-b3c1-45f6-ad0a-3435a48fd033\") " pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:12 crc kubenswrapper[4781]: I1208 20:26:12.746809 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:13 crc kubenswrapper[4781]: I1208 20:26:13.206435 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 08 20:26:13 crc kubenswrapper[4781]: I1208 20:26:13.359034 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"28fbba80-b3c1-45f6-ad0a-3435a48fd033","Type":"ContainerStarted","Data":"600efbee30d387f90ab4bb3c479b94e3beda5c100ff0c0226f49665b8f0bde85"} Dec 08 20:26:13 crc kubenswrapper[4781]: I1208 20:26:13.360472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerStarted","Data":"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"} Dec 08 20:26:13 crc kubenswrapper[4781]: I1208 20:26:13.360532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerStarted","Data":"d0e0c1670ba0210d0539b5214a0b2f0b762f67b328a8712b11987897ce279630"} Dec 08 20:26:14 crc kubenswrapper[4781]: I1208 20:26:14.139400 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4851cd-eb2c-49d7-927a-38b736ca36df" path="/var/lib/kubelet/pods/bb4851cd-eb2c-49d7-927a-38b736ca36df/volumes" Dec 08 20:26:14 crc kubenswrapper[4781]: I1208 20:26:14.369760 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"28fbba80-b3c1-45f6-ad0a-3435a48fd033","Type":"ContainerStarted","Data":"00bddf3745f0810f6079c197937fb6f35bb8d2ac1c3c8d0bed642813d7273be0"} Dec 08 20:26:14 crc kubenswrapper[4781]: I1208 20:26:14.376812 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerStarted","Data":"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"} Dec 08 
20:26:14 crc kubenswrapper[4781]: I1208 20:26:14.394799 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.394778633 podStartE2EDuration="2.394778633s" podCreationTimestamp="2025-12-08 20:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:26:14.386179376 +0000 UTC m=+1290.537462753" watchObservedRunningTime="2025-12-08 20:26:14.394778633 +0000 UTC m=+1290.546062030" Dec 08 20:26:14 crc kubenswrapper[4781]: I1208 20:26:14.712637 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 08 20:26:14 crc kubenswrapper[4781]: I1208 20:26:14.713675 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 08 20:26:14 crc kubenswrapper[4781]: I1208 20:26:14.718071 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 08 20:26:14 crc kubenswrapper[4781]: I1208 20:26:14.718190 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.387097 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerStarted","Data":"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"} Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.387882 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.392044 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.605983 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-4cltn"] Dec 08 
20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.607621 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.629786 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-4cltn"] Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.673000 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.673047 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.673155 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.673188 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc 
kubenswrapper[4781]: I1208 20:26:15.673239 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6khxt\" (UniqueName: \"kubernetes.io/projected/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-kube-api-access-6khxt\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.673260 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-config\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.774991 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.775039 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.775084 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6khxt\" (UniqueName: \"kubernetes.io/projected/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-kube-api-access-6khxt\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 
20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.775102 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-config\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.775160 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.775175 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.776142 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.776447 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.776502 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.776506 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-config\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.776799 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.799569 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6khxt\" (UniqueName: \"kubernetes.io/projected/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-kube-api-access-6khxt\") pod \"dnsmasq-dns-5c9cbcb645-4cltn\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:15 crc kubenswrapper[4781]: I1208 20:26:15.933777 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn"
Dec 08 20:26:16 crc kubenswrapper[4781]: I1208 20:26:16.565605 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-4cltn"]
Dec 08 20:26:17 crc kubenswrapper[4781]: I1208 20:26:17.406507 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerStarted","Data":"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"}
Dec 08 20:26:17 crc kubenswrapper[4781]: I1208 20:26:17.406768 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 08 20:26:17 crc kubenswrapper[4781]: I1208 20:26:17.407974 4781 generic.go:334] "Generic (PLEG): container finished" podID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" containerID="f910a1a66581005ab1c91fccf2bada196fc44eeca6df8314e914f7c56dcabce0" exitCode=0
Dec 08 20:26:17 crc kubenswrapper[4781]: I1208 20:26:17.408072 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" event={"ID":"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93","Type":"ContainerDied","Data":"f910a1a66581005ab1c91fccf2bada196fc44eeca6df8314e914f7c56dcabce0"}
Dec 08 20:26:17 crc kubenswrapper[4781]: I1208 20:26:17.408125 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" event={"ID":"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93","Type":"ContainerStarted","Data":"14a9dfd54e245b342d9f7f23a57db6510ce8990398dbaba8655f4177cbbace99"}
Dec 08 20:26:17 crc kubenswrapper[4781]: I1208 20:26:17.450362 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7943099 podStartE2EDuration="6.450341899s" podCreationTimestamp="2025-12-08 20:26:11 +0000 UTC" firstStartedPulling="2025-12-08 20:26:12.475661541 +0000 UTC m=+1288.626944918" lastFinishedPulling="2025-12-08 20:26:16.13169354 +0000 UTC m=+1292.282976917" observedRunningTime="2025-12-08 20:26:17.432516896 +0000 UTC m=+1293.583800273" watchObservedRunningTime="2025-12-08 20:26:17.450341899 +0000 UTC m=+1293.601625276"
Dec 08 20:26:17 crc kubenswrapper[4781]: I1208 20:26:17.747942 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Dec 08 20:26:18 crc kubenswrapper[4781]: I1208 20:26:18.104633 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 08 20:26:18 crc kubenswrapper[4781]: I1208 20:26:18.417803 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" event={"ID":"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93","Type":"ContainerStarted","Data":"a1cb65ffc720e3f84360697dd46a416438be98fbc4daa7532ab08873844db00d"}
Dec 08 20:26:18 crc kubenswrapper[4781]: I1208 20:26:18.417955 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-log" containerID="cri-o://4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933" gracePeriod=30
Dec 08 20:26:18 crc kubenswrapper[4781]: I1208 20:26:18.418002 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-api" containerID="cri-o://bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297" gracePeriod=30
Dec 08 20:26:18 crc kubenswrapper[4781]: I1208 20:26:18.438723 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" podStartSLOduration=3.438701197 podStartE2EDuration="3.438701197s" podCreationTimestamp="2025-12-08 20:26:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:26:18.437279517 +0000 UTC m=+1294.588562904" watchObservedRunningTime="2025-12-08 20:26:18.438701197 +0000 UTC m=+1294.589984574"
Dec 08 20:26:19 crc kubenswrapper[4781]: I1208 20:26:19.261555 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 20:26:19 crc kubenswrapper[4781]: I1208 20:26:19.430101 4781 generic.go:334] "Generic (PLEG): container finished" podID="0735293f-2f40-4edc-9184-ee889e5784d8" containerID="4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933" exitCode=143
Dec 08 20:26:19 crc kubenswrapper[4781]: I1208 20:26:19.430191 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0735293f-2f40-4edc-9184-ee889e5784d8","Type":"ContainerDied","Data":"4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933"}
Dec 08 20:26:19 crc kubenswrapper[4781]: I1208 20:26:19.430592 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="sg-core" containerID="cri-o://fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d" gracePeriod=30
Dec 08 20:26:19 crc kubenswrapper[4781]: I1208 20:26:19.430594 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="proxy-httpd" containerID="cri-o://8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27" gracePeriod=30
Dec 08 20:26:19 crc kubenswrapper[4781]: I1208 20:26:19.430597 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="ceilometer-notification-agent" containerID="cri-o://dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184" gracePeriod=30
Dec 08 20:26:19 crc kubenswrapper[4781]: I1208 20:26:19.430823 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn"
Dec 08 20:26:19 crc kubenswrapper[4781]: I1208 20:26:19.430491 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="ceilometer-central-agent" containerID="cri-o://4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff" gracePeriod=30
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.336691 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440425 4781 generic.go:334] "Generic (PLEG): container finished" podID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerID="8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27" exitCode=0
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440459 4781 generic.go:334] "Generic (PLEG): container finished" podID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerID="fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d" exitCode=2
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440467 4781 generic.go:334] "Generic (PLEG): container finished" podID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerID="dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184" exitCode=0
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440476 4781 generic.go:334] "Generic (PLEG): container finished" podID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerID="4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff" exitCode=0
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440508 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerDied","Data":"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"}
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440565 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerDied","Data":"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"}
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440578 4781 scope.go:117] "RemoveContainer" containerID="8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440578 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerDied","Data":"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"}
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440737 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerDied","Data":"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"}
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.440760 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"27cc9130-8fda-46c9-acbf-6f73e99ffb32","Type":"ContainerDied","Data":"d0e0c1670ba0210d0539b5214a0b2f0b762f67b328a8712b11987897ce279630"}
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.463433 4781 scope.go:117] "RemoveContainer" containerID="fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.485044 4781 scope.go:117] "RemoveContainer" containerID="dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506039 4781 scope.go:117] "RemoveContainer" containerID="4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506495 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mfz2\" (UniqueName: \"kubernetes.io/projected/27cc9130-8fda-46c9-acbf-6f73e99ffb32-kube-api-access-8mfz2\") pod \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") "
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506593 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-log-httpd\") pod \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") "
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506628 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-sg-core-conf-yaml\") pod \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") "
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506653 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-ceilometer-tls-certs\") pod \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") "
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506728 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-combined-ca-bundle\") pod \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") "
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506780 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-scripts\") pod \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") "
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506948 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-run-httpd\") pod \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") "
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506982 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-config-data\") pod \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\" (UID: \"27cc9130-8fda-46c9-acbf-6f73e99ffb32\") "
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.506998 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "27cc9130-8fda-46c9-acbf-6f73e99ffb32" (UID: "27cc9130-8fda-46c9-acbf-6f73e99ffb32"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.507393 4781 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.508723 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "27cc9130-8fda-46c9-acbf-6f73e99ffb32" (UID: "27cc9130-8fda-46c9-acbf-6f73e99ffb32"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.513767 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27cc9130-8fda-46c9-acbf-6f73e99ffb32-kube-api-access-8mfz2" (OuterVolumeSpecName: "kube-api-access-8mfz2") pod "27cc9130-8fda-46c9-acbf-6f73e99ffb32" (UID: "27cc9130-8fda-46c9-acbf-6f73e99ffb32"). InnerVolumeSpecName "kube-api-access-8mfz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.516185 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-scripts" (OuterVolumeSpecName: "scripts") pod "27cc9130-8fda-46c9-acbf-6f73e99ffb32" (UID: "27cc9130-8fda-46c9-acbf-6f73e99ffb32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.532345 4781 scope.go:117] "RemoveContainer" containerID="8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"
Dec 08 20:26:20 crc kubenswrapper[4781]: E1208 20:26:20.534395 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": container with ID starting with 8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27 not found: ID does not exist" containerID="8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.534457 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"} err="failed to get container status \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": rpc error: code = NotFound desc = could not find container \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": container with ID starting with 8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27 not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.534485 4781 scope.go:117] "RemoveContainer" containerID="fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"
Dec 08 20:26:20 crc kubenswrapper[4781]: E1208 20:26:20.535616 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": container with ID starting with fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d not found: ID does not exist" containerID="fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.535642 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"} err="failed to get container status \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": rpc error: code = NotFound desc = could not find container \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": container with ID starting with fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.535657 4781 scope.go:117] "RemoveContainer" containerID="dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"
Dec 08 20:26:20 crc kubenswrapper[4781]: E1208 20:26:20.536348 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": container with ID starting with dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184 not found: ID does not exist" containerID="dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.536459 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"} err="failed to get container status \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": rpc error: code = NotFound desc = could not find container \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": container with ID starting with dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184 not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.536484 4781 scope.go:117] "RemoveContainer" containerID="4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"
Dec 08 20:26:20 crc kubenswrapper[4781]: E1208 20:26:20.537074 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": container with ID starting with 4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff not found: ID does not exist" containerID="4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.537114 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"} err="failed to get container status \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": rpc error: code = NotFound desc = could not find container \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": container with ID starting with 4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.537138 4781 scope.go:117] "RemoveContainer" containerID="8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.537389 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"} err="failed to get container status \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": rpc error: code = NotFound desc = could not find container \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": container with ID starting with 8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27 not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.537410 4781 scope.go:117] "RemoveContainer" containerID="fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.537625 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"} err="failed to get container status \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": rpc error: code = NotFound desc = could not find container \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": container with ID starting with fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.537642 4781 scope.go:117] "RemoveContainer" containerID="dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.537816 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"} err="failed to get container status \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": rpc error: code = NotFound desc = could not find container \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": container with ID starting with dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184 not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.537833 4781 scope.go:117] "RemoveContainer" containerID="4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.538133 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"} err="failed to get container status \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": rpc error: code = NotFound desc = could not find container \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": container with ID starting with 4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.538171 4781 scope.go:117] "RemoveContainer" containerID="8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.538458 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"} err="failed to get container status \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": rpc error: code = NotFound desc = could not find container \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": container with ID starting with 8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27 not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.538493 4781 scope.go:117] "RemoveContainer" containerID="fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.538738 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"} err="failed to get container status \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": rpc error: code = NotFound desc = could not find container \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": container with ID starting with fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.538762 4781 scope.go:117] "RemoveContainer" containerID="dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.539555 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"} err="failed to get container status \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": rpc error: code = NotFound desc = could not find container \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": container with ID starting with dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184 not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.539579 4781 scope.go:117] "RemoveContainer" containerID="4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.539848 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"} err="failed to get container status \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": rpc error: code = NotFound desc = could not find container \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": container with ID starting with 4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.539900 4781 scope.go:117] "RemoveContainer" containerID="8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.540242 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27"} err="failed to get container status \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": rpc error: code = NotFound desc = could not find container \"8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27\": container with ID starting with 8ef9aa656bbc656b6a2513d5f330ad03c0f109c363e25c89afa454e4cbecad27 not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.540266 4781 scope.go:117] "RemoveContainer" containerID="fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.540444 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d"} err="failed to get container status \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": rpc error: code = NotFound desc = could not find container \"fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d\": container with ID starting with fb99957812bea236856f2521d2f6ac109aef6591b279e98fb87a237c0cbaeb1d not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.540456 4781 scope.go:117] "RemoveContainer" containerID="dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.540611 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184"} err="failed to get container status \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": rpc error: code = NotFound desc = could not find container \"dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184\": container with ID starting with dbbf32a413e887b3987cdb80c806fff937ef820fca02e4aa11d42dff9c55f184 not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.540625 4781 scope.go:117] "RemoveContainer" containerID="4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.540838 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff"} err="failed to get container status \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": rpc error: code = NotFound desc = could not find container \"4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff\": container with ID starting with 4b6ea0d4ae00095aaac640369ce6862e478e5c5350e61ced07e572c7049fc6ff not found: ID does not exist"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.549144 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "27cc9130-8fda-46c9-acbf-6f73e99ffb32" (UID: "27cc9130-8fda-46c9-acbf-6f73e99ffb32"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.586841 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "27cc9130-8fda-46c9-acbf-6f73e99ffb32" (UID: "27cc9130-8fda-46c9-acbf-6f73e99ffb32"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.609261 4781 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.609302 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.609318 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-scripts\") on node \"crc\" DevicePath \"\""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.609357 4781 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/27cc9130-8fda-46c9-acbf-6f73e99ffb32-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.609381 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mfz2\" (UniqueName: \"kubernetes.io/projected/27cc9130-8fda-46c9-acbf-6f73e99ffb32-kube-api-access-8mfz2\") on node \"crc\" DevicePath \"\""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.626453 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27cc9130-8fda-46c9-acbf-6f73e99ffb32" (UID: "27cc9130-8fda-46c9-acbf-6f73e99ffb32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.629692 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-config-data" (OuterVolumeSpecName: "config-data") pod "27cc9130-8fda-46c9-acbf-6f73e99ffb32" (UID: "27cc9130-8fda-46c9-acbf-6f73e99ffb32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.711289 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.711327 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27cc9130-8fda-46c9-acbf-6f73e99ffb32-config-data\") on node \"crc\" DevicePath \"\""
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.773961 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.782231 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.799834 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 08 20:26:20 crc kubenswrapper[4781]: E1208 20:26:20.800280 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="proxy-httpd"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.800302 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="proxy-httpd"
Dec 08 20:26:20 crc kubenswrapper[4781]: E1208 20:26:20.800313 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="ceilometer-notification-agent"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.800322 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="ceilometer-notification-agent"
Dec 08 20:26:20 crc kubenswrapper[4781]: E1208 20:26:20.800345 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="sg-core"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.800352 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="sg-core"
Dec 08 20:26:20 crc kubenswrapper[4781]: E1208 20:26:20.800374 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="ceilometer-central-agent"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.800381 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="ceilometer-central-agent"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.800566 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="ceilometer-notification-agent"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.800586 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="ceilometer-central-agent"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.800594 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="sg-core"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.800609 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" containerName="proxy-httpd"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.802658 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.805396 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.805468 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.805563 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.826255 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.914984 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-scripts\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.915054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.915089 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-config-data\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.915140 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxcdg\" (UniqueName: \"kubernetes.io/projected/9447ae7c-50db-46a0-aeec-7718944d900e-kube-api-access-dxcdg\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.915159 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9447ae7c-50db-46a0-aeec-7718944d900e-log-httpd\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.915199 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9447ae7c-50db-46a0-aeec-7718944d900e-run-httpd\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.915223 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:20 crc kubenswrapper[4781]: I1208 20:26:20.915237 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.016662 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxcdg\" (UniqueName: \"kubernetes.io/projected/9447ae7c-50db-46a0-aeec-7718944d900e-kube-api-access-dxcdg\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.016721 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9447ae7c-50db-46a0-aeec-7718944d900e-log-httpd\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.016794 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9447ae7c-50db-46a0-aeec-7718944d900e-run-httpd\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.016837 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.016860 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.016958 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-scripts\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0"
Dec 08 20:26:21 crc kubenswrapper[4781]: I1208
20:26:21.017003 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.017036 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-config-data\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.017455 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9447ae7c-50db-46a0-aeec-7718944d900e-log-httpd\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.017480 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9447ae7c-50db-46a0-aeec-7718944d900e-run-httpd\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.024981 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.025010 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.025022 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.025001 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-scripts\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.032514 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9447ae7c-50db-46a0-aeec-7718944d900e-config-data\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.035024 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxcdg\" (UniqueName: \"kubernetes.io/projected/9447ae7c-50db-46a0-aeec-7718944d900e-kube-api-access-dxcdg\") pod \"ceilometer-0\" (UID: \"9447ae7c-50db-46a0-aeec-7718944d900e\") " pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.120599 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 08 20:26:21 crc kubenswrapper[4781]: I1208 20:26:21.606686 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 08 20:26:21 crc kubenswrapper[4781]: W1208 20:26:21.666347 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9447ae7c_50db_46a0_aeec_7718944d900e.slice/crio-25a4ca6d7a90dbc784f2dd1525520fdaac4c32e38ac227357fbbf3f13a0caf61 WatchSource:0}: Error finding container 25a4ca6d7a90dbc784f2dd1525520fdaac4c32e38ac227357fbbf3f13a0caf61: Status 404 returned error can't find the container with id 25a4ca6d7a90dbc784f2dd1525520fdaac4c32e38ac227357fbbf3f13a0caf61 Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.026997 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.141384 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0735293f-2f40-4edc-9184-ee889e5784d8-logs\") pod \"0735293f-2f40-4edc-9184-ee889e5784d8\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.141439 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d67dr\" (UniqueName: \"kubernetes.io/projected/0735293f-2f40-4edc-9184-ee889e5784d8-kube-api-access-d67dr\") pod \"0735293f-2f40-4edc-9184-ee889e5784d8\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.141510 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-combined-ca-bundle\") pod \"0735293f-2f40-4edc-9184-ee889e5784d8\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " Dec 08 20:26:22 crc 
kubenswrapper[4781]: I1208 20:26:22.141550 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-config-data\") pod \"0735293f-2f40-4edc-9184-ee889e5784d8\" (UID: \"0735293f-2f40-4edc-9184-ee889e5784d8\") " Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.142676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0735293f-2f40-4edc-9184-ee889e5784d8-logs" (OuterVolumeSpecName: "logs") pod "0735293f-2f40-4edc-9184-ee889e5784d8" (UID: "0735293f-2f40-4edc-9184-ee889e5784d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.145703 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0735293f-2f40-4edc-9184-ee889e5784d8-kube-api-access-d67dr" (OuterVolumeSpecName: "kube-api-access-d67dr") pod "0735293f-2f40-4edc-9184-ee889e5784d8" (UID: "0735293f-2f40-4edc-9184-ee889e5784d8"). InnerVolumeSpecName "kube-api-access-d67dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.154863 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27cc9130-8fda-46c9-acbf-6f73e99ffb32" path="/var/lib/kubelet/pods/27cc9130-8fda-46c9-acbf-6f73e99ffb32/volumes" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.182757 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-config-data" (OuterVolumeSpecName: "config-data") pod "0735293f-2f40-4edc-9184-ee889e5784d8" (UID: "0735293f-2f40-4edc-9184-ee889e5784d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.208073 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0735293f-2f40-4edc-9184-ee889e5784d8" (UID: "0735293f-2f40-4edc-9184-ee889e5784d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.245747 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.245780 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0735293f-2f40-4edc-9184-ee889e5784d8-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.245792 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0735293f-2f40-4edc-9184-ee889e5784d8-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.245805 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d67dr\" (UniqueName: \"kubernetes.io/projected/0735293f-2f40-4edc-9184-ee889e5784d8-kube-api-access-d67dr\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.465197 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9447ae7c-50db-46a0-aeec-7718944d900e","Type":"ContainerStarted","Data":"dc38a079da6923eaebec75167c50dc2202b887ffff25805eb76285397c3c5f99"} Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.465248 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9447ae7c-50db-46a0-aeec-7718944d900e","Type":"ContainerStarted","Data":"25a4ca6d7a90dbc784f2dd1525520fdaac4c32e38ac227357fbbf3f13a0caf61"} Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.467634 4781 generic.go:334] "Generic (PLEG): container finished" podID="0735293f-2f40-4edc-9184-ee889e5784d8" containerID="bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297" exitCode=0 Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.467674 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0735293f-2f40-4edc-9184-ee889e5784d8","Type":"ContainerDied","Data":"bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297"} Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.467699 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0735293f-2f40-4edc-9184-ee889e5784d8","Type":"ContainerDied","Data":"012e770e6eb1bf3f113292f415c802fe5812383daad0aecb6032e0de18b3fbd0"} Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.467727 4781 scope.go:117] "RemoveContainer" containerID="bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.467846 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.500604 4781 scope.go:117] "RemoveContainer" containerID="4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.522495 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.537000 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.544635 4781 scope.go:117] "RemoveContainer" containerID="bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.546567 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:22 crc kubenswrapper[4781]: E1208 20:26:22.546970 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297\": container with ID starting with bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297 not found: ID does not exist" containerID="bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.547013 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297"} err="failed to get container status \"bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297\": rpc error: code = NotFound desc = could not find container \"bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297\": container with ID starting with bab0a9d8a6109c212ee213dd324196287e50cde63fc6ce488cc329909bedc297 not found: ID does not exist" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.547037 4781 scope.go:117] 
"RemoveContainer" containerID="4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933" Dec 08 20:26:22 crc kubenswrapper[4781]: E1208 20:26:22.547039 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-api" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.547056 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-api" Dec 08 20:26:22 crc kubenswrapper[4781]: E1208 20:26:22.547080 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-log" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.547085 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-log" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.547284 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-log" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.547301 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" containerName="nova-api-api" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.548293 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: E1208 20:26:22.549227 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933\": container with ID starting with 4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933 not found: ID does not exist" containerID="4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.549265 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933"} err="failed to get container status \"4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933\": rpc error: code = NotFound desc = could not find container \"4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933\": container with ID starting with 4903da7461dc5b6f5cb29e1f4d0d199e792a2a2e8fe67209d543644d6e8a6933 not found: ID does not exist" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.558046 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.572122 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.572358 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.572593 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.653500 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-logs\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.653589 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rn99\" (UniqueName: \"kubernetes.io/projected/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-kube-api-access-2rn99\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.653624 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-config-data\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.653695 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.653718 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.653784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.747983 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.757427 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.757534 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-logs\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.757580 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rn99\" (UniqueName: \"kubernetes.io/projected/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-kube-api-access-2rn99\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.757608 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-config-data\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.757671 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " 
pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.757694 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.758788 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-logs\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.765177 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.766847 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-public-tls-certs\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.773671 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.776297 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-config-data\") pod 
\"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.799605 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rn99\" (UniqueName: \"kubernetes.io/projected/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-kube-api-access-2rn99\") pod \"nova-api-0\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " pod="openstack/nova-api-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.847133 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:22 crc kubenswrapper[4781]: I1208 20:26:22.976323 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.486696 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9447ae7c-50db-46a0-aeec-7718944d900e","Type":"ContainerStarted","Data":"e4b5d6085989261506bf6ca7eaa6f18460650228fb8ac29f601c57b8d4604ac7"} Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.532997 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.636778 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.752883 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fq7lx"] Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.754150 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.760276 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.760476 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.763744 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fq7lx"] Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.896246 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-config-data\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.896993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.897109 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-scripts\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.897601 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4wc5\" (UniqueName: 
\"kubernetes.io/projected/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-kube-api-access-q4wc5\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:23 crc kubenswrapper[4781]: I1208 20:26:23.999823 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:23.999871 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-scripts\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.000000 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4wc5\" (UniqueName: \"kubernetes.io/projected/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-kube-api-access-q4wc5\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.000043 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-config-data\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.004848 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-scripts\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.008290 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.008959 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-config-data\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.019315 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4wc5\" (UniqueName: \"kubernetes.io/projected/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-kube-api-access-q4wc5\") pod \"nova-cell1-cell-mapping-fq7lx\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.114716 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.140754 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0735293f-2f40-4edc-9184-ee889e5784d8" path="/var/lib/kubelet/pods/0735293f-2f40-4edc-9184-ee889e5784d8/volumes" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.501134 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7","Type":"ContainerStarted","Data":"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294"} Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.501462 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7","Type":"ContainerStarted","Data":"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e"} Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.501473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7","Type":"ContainerStarted","Data":"e52237e7f261d482b720e49a8a60500fc414462bea205e1252bfa22baf9f08a7"} Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.537306 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5372799969999997 podStartE2EDuration="2.537279997s" podCreationTimestamp="2025-12-08 20:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:26:24.519593839 +0000 UTC m=+1300.670877226" watchObservedRunningTime="2025-12-08 20:26:24.537279997 +0000 UTC m=+1300.688563374" Dec 08 20:26:24 crc kubenswrapper[4781]: I1208 20:26:24.663328 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fq7lx"] Dec 08 20:26:24 crc kubenswrapper[4781]: W1208 20:26:24.667811 
4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d4f9f6a_d72d_41c6_92d1_7b90bdd35123.slice/crio-9303c5823e63c6f97c50d628f3d352fb2122c373ad5ce93d67b1a5610d420d67 WatchSource:0}: Error finding container 9303c5823e63c6f97c50d628f3d352fb2122c373ad5ce93d67b1a5610d420d67: Status 404 returned error can't find the container with id 9303c5823e63c6f97c50d628f3d352fb2122c373ad5ce93d67b1a5610d420d67 Dec 08 20:26:25 crc kubenswrapper[4781]: I1208 20:26:25.516391 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9447ae7c-50db-46a0-aeec-7718944d900e","Type":"ContainerStarted","Data":"8e2e171e34af33def9c444c14803eb0ffde810ea13c7d35294a4e4c760047179"} Dec 08 20:26:25 crc kubenswrapper[4781]: I1208 20:26:25.520475 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fq7lx" event={"ID":"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123","Type":"ContainerStarted","Data":"26c7fb32a9c94a7dbfda9ab0b2feccf37c902135b5a6b9566c3039906bbb3aef"} Dec 08 20:26:25 crc kubenswrapper[4781]: I1208 20:26:25.520530 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fq7lx" event={"ID":"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123","Type":"ContainerStarted","Data":"9303c5823e63c6f97c50d628f3d352fb2122c373ad5ce93d67b1a5610d420d67"} Dec 08 20:26:25 crc kubenswrapper[4781]: I1208 20:26:25.542916 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fq7lx" podStartSLOduration=2.542898263 podStartE2EDuration="2.542898263s" podCreationTimestamp="2025-12-08 20:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:26:25.532611908 +0000 UTC m=+1301.683895295" watchObservedRunningTime="2025-12-08 20:26:25.542898263 +0000 UTC m=+1301.694181640" Dec 08 20:26:25 crc 
kubenswrapper[4781]: I1208 20:26:25.936228 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.005309 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-hf9gt"] Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.005569 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" podUID="89c523b0-35de-4752-88b4-3493cf0502b4" containerName="dnsmasq-dns" containerID="cri-o://5daf54665f713e6ff5780d763f9e11b552ceb9574cd092749fbe79136e8bf76d" gracePeriod=10 Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.531875 4781 generic.go:334] "Generic (PLEG): container finished" podID="89c523b0-35de-4752-88b4-3493cf0502b4" containerID="5daf54665f713e6ff5780d763f9e11b552ceb9574cd092749fbe79136e8bf76d" exitCode=0 Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.531963 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" event={"ID":"89c523b0-35de-4752-88b4-3493cf0502b4","Type":"ContainerDied","Data":"5daf54665f713e6ff5780d763f9e11b552ceb9574cd092749fbe79136e8bf76d"} Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.541907 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9447ae7c-50db-46a0-aeec-7718944d900e","Type":"ContainerStarted","Data":"4efd9f50994694c34543de441166fed719ef6ffd8b6150cbf071c38727c36a93"} Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.542321 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.592732 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4912211859999998 podStartE2EDuration="6.592716699s" podCreationTimestamp="2025-12-08 20:26:20 +0000 
UTC" firstStartedPulling="2025-12-08 20:26:21.668962296 +0000 UTC m=+1297.820245674" lastFinishedPulling="2025-12-08 20:26:25.77045781 +0000 UTC m=+1301.921741187" observedRunningTime="2025-12-08 20:26:26.579288433 +0000 UTC m=+1302.730571810" watchObservedRunningTime="2025-12-08 20:26:26.592716699 +0000 UTC m=+1302.744000076" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.698015 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.771670 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-config\") pod \"89c523b0-35de-4752-88b4-3493cf0502b4\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.771718 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbh5g\" (UniqueName: \"kubernetes.io/projected/89c523b0-35de-4752-88b4-3493cf0502b4-kube-api-access-bbh5g\") pod \"89c523b0-35de-4752-88b4-3493cf0502b4\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.771744 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-nb\") pod \"89c523b0-35de-4752-88b4-3493cf0502b4\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.771778 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-svc\") pod \"89c523b0-35de-4752-88b4-3493cf0502b4\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.771832 4781 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-sb\") pod \"89c523b0-35de-4752-88b4-3493cf0502b4\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.771916 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-swift-storage-0\") pod \"89c523b0-35de-4752-88b4-3493cf0502b4\" (UID: \"89c523b0-35de-4752-88b4-3493cf0502b4\") " Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.776891 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c523b0-35de-4752-88b4-3493cf0502b4-kube-api-access-bbh5g" (OuterVolumeSpecName: "kube-api-access-bbh5g") pod "89c523b0-35de-4752-88b4-3493cf0502b4" (UID: "89c523b0-35de-4752-88b4-3493cf0502b4"). InnerVolumeSpecName "kube-api-access-bbh5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.836530 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "89c523b0-35de-4752-88b4-3493cf0502b4" (UID: "89c523b0-35de-4752-88b4-3493cf0502b4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.837909 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "89c523b0-35de-4752-88b4-3493cf0502b4" (UID: "89c523b0-35de-4752-88b4-3493cf0502b4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.846363 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "89c523b0-35de-4752-88b4-3493cf0502b4" (UID: "89c523b0-35de-4752-88b4-3493cf0502b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.846699 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-config" (OuterVolumeSpecName: "config") pod "89c523b0-35de-4752-88b4-3493cf0502b4" (UID: "89c523b0-35de-4752-88b4-3493cf0502b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.853519 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "89c523b0-35de-4752-88b4-3493cf0502b4" (UID: "89c523b0-35de-4752-88b4-3493cf0502b4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.874550 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.874597 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.874611 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbh5g\" (UniqueName: \"kubernetes.io/projected/89c523b0-35de-4752-88b4-3493cf0502b4-kube-api-access-bbh5g\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.874625 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.874637 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:26 crc kubenswrapper[4781]: I1208 20:26:26.874689 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/89c523b0-35de-4752-88b4-3493cf0502b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:27 crc kubenswrapper[4781]: I1208 20:26:27.551540 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" event={"ID":"89c523b0-35de-4752-88b4-3493cf0502b4","Type":"ContainerDied","Data":"9b5eb9c74ab697e3aa1a39439ccc11db54a293fbb0917d48cc2ed7695435c2bb"} Dec 08 20:26:27 crc 
kubenswrapper[4781]: I1208 20:26:27.551594 4781 scope.go:117] "RemoveContainer" containerID="5daf54665f713e6ff5780d763f9e11b552ceb9574cd092749fbe79136e8bf76d" Dec 08 20:26:27 crc kubenswrapper[4781]: I1208 20:26:27.551626 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" Dec 08 20:26:27 crc kubenswrapper[4781]: I1208 20:26:27.578393 4781 scope.go:117] "RemoveContainer" containerID="f634f19c9216173a4016c40deebba4c5235d55d6b0fe28cd68af8bcdefe99281" Dec 08 20:26:27 crc kubenswrapper[4781]: I1208 20:26:27.594247 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-hf9gt"] Dec 08 20:26:27 crc kubenswrapper[4781]: I1208 20:26:27.604733 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-hf9gt"] Dec 08 20:26:28 crc kubenswrapper[4781]: I1208 20:26:28.137738 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c523b0-35de-4752-88b4-3493cf0502b4" path="/var/lib/kubelet/pods/89c523b0-35de-4752-88b4-3493cf0502b4/volumes" Dec 08 20:26:30 crc kubenswrapper[4781]: I1208 20:26:30.603178 4781 generic.go:334] "Generic (PLEG): container finished" podID="5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" containerID="26c7fb32a9c94a7dbfda9ab0b2feccf37c902135b5a6b9566c3039906bbb3aef" exitCode=0 Dec 08 20:26:30 crc kubenswrapper[4781]: I1208 20:26:30.603267 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fq7lx" event={"ID":"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123","Type":"ContainerDied","Data":"26c7fb32a9c94a7dbfda9ab0b2feccf37c902135b5a6b9566c3039906bbb3aef"} Dec 08 20:26:31 crc kubenswrapper[4781]: I1208 20:26:31.678053 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c4475fdfc-hf9gt" podUID="89c523b0-35de-4752-88b4-3493cf0502b4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.188:5353: i/o timeout" Dec 08 
20:26:31 crc kubenswrapper[4781]: I1208 20:26:31.971117 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.078510 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4wc5\" (UniqueName: \"kubernetes.io/projected/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-kube-api-access-q4wc5\") pod \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.078684 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-config-data\") pod \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.078739 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-scripts\") pod \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.078800 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-combined-ca-bundle\") pod \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\" (UID: \"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123\") " Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.085104 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-scripts" (OuterVolumeSpecName: "scripts") pod "5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" (UID: "5d4f9f6a-d72d-41c6-92d1-7b90bdd35123"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.085123 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-kube-api-access-q4wc5" (OuterVolumeSpecName: "kube-api-access-q4wc5") pod "5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" (UID: "5d4f9f6a-d72d-41c6-92d1-7b90bdd35123"). InnerVolumeSpecName "kube-api-access-q4wc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.109878 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-config-data" (OuterVolumeSpecName: "config-data") pod "5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" (UID: "5d4f9f6a-d72d-41c6-92d1-7b90bdd35123"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.112080 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" (UID: "5d4f9f6a-d72d-41c6-92d1-7b90bdd35123"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.181629 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4wc5\" (UniqueName: \"kubernetes.io/projected/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-kube-api-access-q4wc5\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.181658 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.181667 4781 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-scripts\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.181677 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.619376 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fq7lx" event={"ID":"5d4f9f6a-d72d-41c6-92d1-7b90bdd35123","Type":"ContainerDied","Data":"9303c5823e63c6f97c50d628f3d352fb2122c373ad5ce93d67b1a5610d420d67"} Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.619417 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9303c5823e63c6f97c50d628f3d352fb2122c373ad5ce93d67b1a5610d420d67" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.619422 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fq7lx" Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.822098 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.822712 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerName="nova-api-log" containerID="cri-o://cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e" gracePeriod=30 Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.823162 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerName="nova-api-api" containerID="cri-o://6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294" gracePeriod=30 Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.833894 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.834106 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e3331e6e-3384-4cc5-af5a-c24583a6865b" containerName="nova-scheduler-scheduler" containerID="cri-o://7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8" gracePeriod=30 Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.855808 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.856082 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-log" containerID="cri-o://44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869" gracePeriod=30 Dec 08 20:26:32 crc kubenswrapper[4781]: I1208 20:26:32.856159 4781 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-metadata" containerID="cri-o://e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2" gracePeriod=30 Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.383458 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.501809 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-public-tls-certs\") pod \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.501905 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-internal-tls-certs\") pod \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.502067 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-combined-ca-bundle\") pod \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.502119 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-config-data\") pod \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.502141 4781 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2rn99\" (UniqueName: \"kubernetes.io/projected/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-kube-api-access-2rn99\") pod \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.502256 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-logs\") pod \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\" (UID: \"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7\") " Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.502728 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-logs" (OuterVolumeSpecName: "logs") pod "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" (UID: "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.507199 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-kube-api-access-2rn99" (OuterVolumeSpecName: "kube-api-access-2rn99") pod "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" (UID: "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7"). InnerVolumeSpecName "kube-api-access-2rn99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.529909 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" (UID: "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.531789 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-config-data" (OuterVolumeSpecName: "config-data") pod "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" (UID: "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.556535 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" (UID: "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.560779 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" (UID: "e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.604350 4781 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.604384 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.604394 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rn99\" (UniqueName: \"kubernetes.io/projected/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-kube-api-access-2rn99\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.604406 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.604418 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.604429 4781 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.629618 4781 generic.go:334] "Generic (PLEG): container finished" podID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerID="6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294" exitCode=0 Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.629650 4781 generic.go:334] 
"Generic (PLEG): container finished" podID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerID="cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e" exitCode=143 Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.629671 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.629705 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7","Type":"ContainerDied","Data":"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294"} Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.629761 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7","Type":"ContainerDied","Data":"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e"} Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.629774 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7","Type":"ContainerDied","Data":"e52237e7f261d482b720e49a8a60500fc414462bea205e1252bfa22baf9f08a7"} Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.629793 4781 scope.go:117] "RemoveContainer" containerID="6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.632540 4781 generic.go:334] "Generic (PLEG): container finished" podID="895bd81e-c873-46ce-99cd-526084be1061" containerID="44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869" exitCode=143 Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.632575 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"895bd81e-c873-46ce-99cd-526084be1061","Type":"ContainerDied","Data":"44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869"} Dec 08 20:26:33 crc 
kubenswrapper[4781]: I1208 20:26:33.652517 4781 scope.go:117] "RemoveContainer" containerID="cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.683099 4781 scope.go:117] "RemoveContainer" containerID="6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294" Dec 08 20:26:33 crc kubenswrapper[4781]: E1208 20:26:33.684172 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294\": container with ID starting with 6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294 not found: ID does not exist" containerID="6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.684249 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294"} err="failed to get container status \"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294\": rpc error: code = NotFound desc = could not find container \"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294\": container with ID starting with 6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294 not found: ID does not exist" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.684282 4781 scope.go:117] "RemoveContainer" containerID="cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e" Dec 08 20:26:33 crc kubenswrapper[4781]: E1208 20:26:33.684682 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e\": container with ID starting with cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e not found: ID does not exist" 
containerID="cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.684709 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e"} err="failed to get container status \"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e\": rpc error: code = NotFound desc = could not find container \"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e\": container with ID starting with cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e not found: ID does not exist" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.684727 4781 scope.go:117] "RemoveContainer" containerID="6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.684974 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294"} err="failed to get container status \"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294\": rpc error: code = NotFound desc = could not find container \"6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294\": container with ID starting with 6eacad5dc3c582636c20386b59247a351e3a05b0b8b5361d10c2a8ff75078294 not found: ID does not exist" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.685003 4781 scope.go:117] "RemoveContainer" containerID="cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.686325 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e"} err="failed to get container status \"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e\": rpc error: code = NotFound desc = could 
not find container \"cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e\": container with ID starting with cf28a9a60519ad1ba5ab6549aa9651b054b646d7792f78d634b393900dd6b67e not found: ID does not exist" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.691687 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.702841 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.715260 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:33 crc kubenswrapper[4781]: E1208 20:26:33.715750 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" containerName="nova-manage" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.715774 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" containerName="nova-manage" Dec 08 20:26:33 crc kubenswrapper[4781]: E1208 20:26:33.715799 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c523b0-35de-4752-88b4-3493cf0502b4" containerName="dnsmasq-dns" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.715838 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c523b0-35de-4752-88b4-3493cf0502b4" containerName="dnsmasq-dns" Dec 08 20:26:33 crc kubenswrapper[4781]: E1208 20:26:33.715855 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerName="nova-api-api" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.715863 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerName="nova-api-api" Dec 08 20:26:33 crc kubenswrapper[4781]: E1208 20:26:33.715878 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c523b0-35de-4752-88b4-3493cf0502b4" 
containerName="init" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.715886 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c523b0-35de-4752-88b4-3493cf0502b4" containerName="init" Dec 08 20:26:33 crc kubenswrapper[4781]: E1208 20:26:33.715905 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerName="nova-api-log" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.715992 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerName="nova-api-log" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.716245 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerName="nova-api-log" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.716432 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" containerName="nova-api-api" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.716460 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" containerName="nova-manage" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.716473 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c523b0-35de-4752-88b4-3493cf0502b4" containerName="dnsmasq-dns" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.717883 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.721051 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.721054 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.721331 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.725580 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.808576 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb9sp\" (UniqueName: \"kubernetes.io/projected/cba17aa8-5de4-4747-bc12-50b1d1b66490-kube-api-access-cb9sp\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.808699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.808727 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba17aa8-5de4-4747-bc12-50b1d1b66490-logs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.808750 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.808780 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-config-data\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.808809 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-public-tls-certs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.910637 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-config-data\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.910678 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-public-tls-certs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.910784 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb9sp\" (UniqueName: \"kubernetes.io/projected/cba17aa8-5de4-4747-bc12-50b1d1b66490-kube-api-access-cb9sp\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 
20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.910824 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.910842 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba17aa8-5de4-4747-bc12-50b1d1b66490-logs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.910857 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.911461 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba17aa8-5de4-4747-bc12-50b1d1b66490-logs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.914386 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-public-tls-certs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.914446 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.914477 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-config-data\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.916218 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cba17aa8-5de4-4747-bc12-50b1d1b66490-internal-tls-certs\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:33 crc kubenswrapper[4781]: I1208 20:26:33.927651 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb9sp\" (UniqueName: \"kubernetes.io/projected/cba17aa8-5de4-4747-bc12-50b1d1b66490-kube-api-access-cb9sp\") pod \"nova-api-0\" (UID: \"cba17aa8-5de4-4747-bc12-50b1d1b66490\") " pod="openstack/nova-api-0" Dec 08 20:26:34 crc kubenswrapper[4781]: I1208 20:26:34.039936 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 08 20:26:34 crc kubenswrapper[4781]: I1208 20:26:34.147849 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7" path="/var/lib/kubelet/pods/e2c5a9fa-2538-4fc6-8c09-05bc291d6fa7/volumes" Dec 08 20:26:34 crc kubenswrapper[4781]: E1208 20:26:34.403793 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 20:26:34 crc kubenswrapper[4781]: E1208 20:26:34.405704 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 20:26:34 crc kubenswrapper[4781]: E1208 20:26:34.407057 4781 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 08 20:26:34 crc kubenswrapper[4781]: E1208 20:26:34.407099 4781 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e3331e6e-3384-4cc5-af5a-c24583a6865b" containerName="nova-scheduler-scheduler" Dec 08 20:26:34 crc kubenswrapper[4781]: I1208 20:26:34.470875 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-0"] Dec 08 20:26:34 crc kubenswrapper[4781]: I1208 20:26:34.643125 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cba17aa8-5de4-4747-bc12-50b1d1b66490","Type":"ContainerStarted","Data":"d9fb5f203b2a6d6030643ad5b917962514927fc7b92e52967c6f33f76ee35e1f"} Dec 08 20:26:34 crc kubenswrapper[4781]: I1208 20:26:34.643479 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cba17aa8-5de4-4747-bc12-50b1d1b66490","Type":"ContainerStarted","Data":"efebe9fbceb6fd0d852cc390d3c0524e5062c8d71efca9b81f6549b1a3a20943"} Dec 08 20:26:35 crc kubenswrapper[4781]: I1208 20:26:35.660512 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cba17aa8-5de4-4747-bc12-50b1d1b66490","Type":"ContainerStarted","Data":"118460d951283a957ea7be2de2a8ce67cbb4bb7e4a98c310d16d09377d07282b"} Dec 08 20:26:35 crc kubenswrapper[4781]: I1208 20:26:35.699148 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.699118925 podStartE2EDuration="2.699118925s" podCreationTimestamp="2025-12-08 20:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:26:35.692194696 +0000 UTC m=+1311.843478073" watchObservedRunningTime="2025-12-08 20:26:35.699118925 +0000 UTC m=+1311.850402342" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.000539 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:47782->10.217.0.192:8775: read: connection reset by peer" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.000554 4781 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-metadata-0" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.192:8775/\": read tcp 10.217.0.2:47774->10.217.0.192:8775: read: connection reset by peer" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.424676 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.566562 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-combined-ca-bundle\") pod \"895bd81e-c873-46ce-99cd-526084be1061\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.566615 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-config-data\") pod \"895bd81e-c873-46ce-99cd-526084be1061\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.566803 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knh55\" (UniqueName: \"kubernetes.io/projected/895bd81e-c873-46ce-99cd-526084be1061-kube-api-access-knh55\") pod \"895bd81e-c873-46ce-99cd-526084be1061\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.567400 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/895bd81e-c873-46ce-99cd-526084be1061-logs" (OuterVolumeSpecName: "logs") pod "895bd81e-c873-46ce-99cd-526084be1061" (UID: "895bd81e-c873-46ce-99cd-526084be1061"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.567551 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895bd81e-c873-46ce-99cd-526084be1061-logs\") pod \"895bd81e-c873-46ce-99cd-526084be1061\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.567676 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-nova-metadata-tls-certs\") pod \"895bd81e-c873-46ce-99cd-526084be1061\" (UID: \"895bd81e-c873-46ce-99cd-526084be1061\") " Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.568474 4781 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/895bd81e-c873-46ce-99cd-526084be1061-logs\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.572453 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895bd81e-c873-46ce-99cd-526084be1061-kube-api-access-knh55" (OuterVolumeSpecName: "kube-api-access-knh55") pod "895bd81e-c873-46ce-99cd-526084be1061" (UID: "895bd81e-c873-46ce-99cd-526084be1061"). InnerVolumeSpecName "kube-api-access-knh55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.597010 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "895bd81e-c873-46ce-99cd-526084be1061" (UID: "895bd81e-c873-46ce-99cd-526084be1061"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.599273 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-config-data" (OuterVolumeSpecName: "config-data") pod "895bd81e-c873-46ce-99cd-526084be1061" (UID: "895bd81e-c873-46ce-99cd-526084be1061"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.627169 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "895bd81e-c873-46ce-99cd-526084be1061" (UID: "895bd81e-c873-46ce-99cd-526084be1061"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.672037 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knh55\" (UniqueName: \"kubernetes.io/projected/895bd81e-c873-46ce-99cd-526084be1061-kube-api-access-knh55\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.672068 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.672077 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.672086 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/895bd81e-c873-46ce-99cd-526084be1061-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.674108 4781 generic.go:334] "Generic (PLEG): container finished" podID="895bd81e-c873-46ce-99cd-526084be1061" containerID="e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2" exitCode=0 Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.674170 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.674371 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"895bd81e-c873-46ce-99cd-526084be1061","Type":"ContainerDied","Data":"e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2"} Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.674536 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"895bd81e-c873-46ce-99cd-526084be1061","Type":"ContainerDied","Data":"695c343718483ccfb2e522b88d307785555ce4278d10ddd8c6db24e339908229"} Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.674612 4781 scope.go:117] "RemoveContainer" containerID="e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.705127 4781 scope.go:117] "RemoveContainer" containerID="44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.707619 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.724712 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.757674 4781 scope.go:117] "RemoveContainer" containerID="e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2" Dec 08 20:26:36 crc 
kubenswrapper[4781]: E1208 20:26:36.764534 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2\": container with ID starting with e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2 not found: ID does not exist" containerID="e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.764585 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2"} err="failed to get container status \"e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2\": rpc error: code = NotFound desc = could not find container \"e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2\": container with ID starting with e41071751f26fa57feb64e4393391d3ad389d07bc8b90a5570edc8a0d3faf1e2 not found: ID does not exist" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.764615 4781 scope.go:117] "RemoveContainer" containerID="44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.764719 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:26:36 crc kubenswrapper[4781]: E1208 20:26:36.765119 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-metadata" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.765139 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-metadata" Dec 08 20:26:36 crc kubenswrapper[4781]: E1208 20:26:36.765170 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-log" Dec 08 20:26:36 
crc kubenswrapper[4781]: I1208 20:26:36.765177 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-log" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.765342 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-log" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.765364 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="895bd81e-c873-46ce-99cd-526084be1061" containerName="nova-metadata-metadata" Dec 08 20:26:36 crc kubenswrapper[4781]: E1208 20:26:36.766130 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869\": container with ID starting with 44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869 not found: ID does not exist" containerID="44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.766167 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869"} err="failed to get container status \"44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869\": rpc error: code = NotFound desc = could not find container \"44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869\": container with ID starting with 44083416c5d1354bc46a5a47c270e65e34e16d44bda08f0843179121a6f66869 not found: ID does not exist" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.766379 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.774970 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.776062 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.785904 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.889349 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.889396 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vcpb\" (UniqueName: \"kubernetes.io/projected/2348d949-d7f2-44a5-8e47-58b358d060c8-kube-api-access-7vcpb\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.889436 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.889470 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-config-data\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.889499 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2348d949-d7f2-44a5-8e47-58b358d060c8-logs\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.991364 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-config-data\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.991640 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2348d949-d7f2-44a5-8e47-58b358d060c8-logs\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.991761 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.991785 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vcpb\" (UniqueName: \"kubernetes.io/projected/2348d949-d7f2-44a5-8e47-58b358d060c8-kube-api-access-7vcpb\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 
20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.991818 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.992527 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2348d949-d7f2-44a5-8e47-58b358d060c8-logs\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.996289 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.996717 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:36 crc kubenswrapper[4781]: I1208 20:26:36.998510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2348d949-d7f2-44a5-8e47-58b358d060c8-config-data\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:37 crc kubenswrapper[4781]: I1208 20:26:37.007911 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vcpb\" (UniqueName: 
\"kubernetes.io/projected/2348d949-d7f2-44a5-8e47-58b358d060c8-kube-api-access-7vcpb\") pod \"nova-metadata-0\" (UID: \"2348d949-d7f2-44a5-8e47-58b358d060c8\") " pod="openstack/nova-metadata-0" Dec 08 20:26:37 crc kubenswrapper[4781]: I1208 20:26:37.100759 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 08 20:26:37 crc kubenswrapper[4781]: I1208 20:26:37.546459 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 08 20:26:37 crc kubenswrapper[4781]: W1208 20:26:37.553231 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2348d949_d7f2_44a5_8e47_58b358d060c8.slice/crio-b5b1031be94a86fb4fbca0b2d3a51b6fd3b57246d9bc7e20539a660cc5a14d0d WatchSource:0}: Error finding container b5b1031be94a86fb4fbca0b2d3a51b6fd3b57246d9bc7e20539a660cc5a14d0d: Status 404 returned error can't find the container with id b5b1031be94a86fb4fbca0b2d3a51b6fd3b57246d9bc7e20539a660cc5a14d0d Dec 08 20:26:37 crc kubenswrapper[4781]: I1208 20:26:37.683512 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2348d949-d7f2-44a5-8e47-58b358d060c8","Type":"ContainerStarted","Data":"b5b1031be94a86fb4fbca0b2d3a51b6fd3b57246d9bc7e20539a660cc5a14d0d"} Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.147123 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="895bd81e-c873-46ce-99cd-526084be1061" path="/var/lib/kubelet/pods/895bd81e-c873-46ce-99cd-526084be1061/volumes" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.386520 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.516353 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-config-data\") pod \"e3331e6e-3384-4cc5-af5a-c24583a6865b\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.516523 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmtfg\" (UniqueName: \"kubernetes.io/projected/e3331e6e-3384-4cc5-af5a-c24583a6865b-kube-api-access-bmtfg\") pod \"e3331e6e-3384-4cc5-af5a-c24583a6865b\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.516702 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-combined-ca-bundle\") pod \"e3331e6e-3384-4cc5-af5a-c24583a6865b\" (UID: \"e3331e6e-3384-4cc5-af5a-c24583a6865b\") " Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.522091 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3331e6e-3384-4cc5-af5a-c24583a6865b-kube-api-access-bmtfg" (OuterVolumeSpecName: "kube-api-access-bmtfg") pod "e3331e6e-3384-4cc5-af5a-c24583a6865b" (UID: "e3331e6e-3384-4cc5-af5a-c24583a6865b"). InnerVolumeSpecName "kube-api-access-bmtfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.546997 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-config-data" (OuterVolumeSpecName: "config-data") pod "e3331e6e-3384-4cc5-af5a-c24583a6865b" (UID: "e3331e6e-3384-4cc5-af5a-c24583a6865b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.555437 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3331e6e-3384-4cc5-af5a-c24583a6865b" (UID: "e3331e6e-3384-4cc5-af5a-c24583a6865b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.619068 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmtfg\" (UniqueName: \"kubernetes.io/projected/e3331e6e-3384-4cc5-af5a-c24583a6865b-kube-api-access-bmtfg\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.619105 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.619117 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3331e6e-3384-4cc5-af5a-c24583a6865b-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.697516 4781 generic.go:334] "Generic (PLEG): container finished" podID="e3331e6e-3384-4cc5-af5a-c24583a6865b" containerID="7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8" exitCode=0 Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.697567 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.697588 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3331e6e-3384-4cc5-af5a-c24583a6865b","Type":"ContainerDied","Data":"7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8"} Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.697619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e3331e6e-3384-4cc5-af5a-c24583a6865b","Type":"ContainerDied","Data":"9031ffd6ba7a2ee2365d0552f2c17e140f0aab1dde07f05a63e2ae5661475ea1"} Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.697646 4781 scope.go:117] "RemoveContainer" containerID="7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.704343 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2348d949-d7f2-44a5-8e47-58b358d060c8","Type":"ContainerStarted","Data":"673e8311260853d9d40fc73368d1ac1dc4a4e876649ab782bcfc34c1294ad9f1"} Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.704379 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2348d949-d7f2-44a5-8e47-58b358d060c8","Type":"ContainerStarted","Data":"248a23e0dbb3c0c76a6701ead5b68ea533f660ffb8230a732540435fc8ff2de7"} Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.730465 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.730448828 podStartE2EDuration="2.730448828s" podCreationTimestamp="2025-12-08 20:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:26:38.727176434 +0000 UTC m=+1314.878459811" watchObservedRunningTime="2025-12-08 20:26:38.730448828 +0000 UTC m=+1314.881732205" 
Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.730951 4781 scope.go:117] "RemoveContainer" containerID="7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8" Dec 08 20:26:38 crc kubenswrapper[4781]: E1208 20:26:38.732543 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8\": container with ID starting with 7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8 not found: ID does not exist" containerID="7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.732577 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8"} err="failed to get container status \"7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8\": rpc error: code = NotFound desc = could not find container \"7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8\": container with ID starting with 7355e6e2a4c671c8cf9376224c9b504fb50d5301d738156e756531b007b16cd8 not found: ID does not exist" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.758149 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.775275 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.790048 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:26:38 crc kubenswrapper[4781]: E1208 20:26:38.790604 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3331e6e-3384-4cc5-af5a-c24583a6865b" containerName="nova-scheduler-scheduler" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.790628 4781 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e3331e6e-3384-4cc5-af5a-c24583a6865b" containerName="nova-scheduler-scheduler" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.790869 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3331e6e-3384-4cc5-af5a-c24583a6865b" containerName="nova-scheduler-scheduler" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.791656 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.794174 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.803431 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.924194 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjcw2\" (UniqueName: \"kubernetes.io/projected/daf2aa43-ddc7-4618-a3df-665b947b68bd-kube-api-access-vjcw2\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.924303 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daf2aa43-ddc7-4618-a3df-665b947b68bd-config-data\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:38 crc kubenswrapper[4781]: I1208 20:26:38.924566 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf2aa43-ddc7-4618-a3df-665b947b68bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:39 crc 
kubenswrapper[4781]: I1208 20:26:39.027554 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjcw2\" (UniqueName: \"kubernetes.io/projected/daf2aa43-ddc7-4618-a3df-665b947b68bd-kube-api-access-vjcw2\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:39 crc kubenswrapper[4781]: I1208 20:26:39.027636 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daf2aa43-ddc7-4618-a3df-665b947b68bd-config-data\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:39 crc kubenswrapper[4781]: I1208 20:26:39.027698 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf2aa43-ddc7-4618-a3df-665b947b68bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:39 crc kubenswrapper[4781]: I1208 20:26:39.032303 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/daf2aa43-ddc7-4618-a3df-665b947b68bd-config-data\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:39 crc kubenswrapper[4781]: I1208 20:26:39.032344 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daf2aa43-ddc7-4618-a3df-665b947b68bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:39 crc kubenswrapper[4781]: I1208 20:26:39.048195 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjcw2\" (UniqueName: 
\"kubernetes.io/projected/daf2aa43-ddc7-4618-a3df-665b947b68bd-kube-api-access-vjcw2\") pod \"nova-scheduler-0\" (UID: \"daf2aa43-ddc7-4618-a3df-665b947b68bd\") " pod="openstack/nova-scheduler-0" Dec 08 20:26:39 crc kubenswrapper[4781]: I1208 20:26:39.110541 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 08 20:26:39 crc kubenswrapper[4781]: I1208 20:26:39.560318 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 08 20:26:39 crc kubenswrapper[4781]: I1208 20:26:39.722960 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"daf2aa43-ddc7-4618-a3df-665b947b68bd","Type":"ContainerStarted","Data":"a6d6d727bb66d92d6dcfffc92f1061489f04fd8257ad6d3953cbc88e4c806bce"} Dec 08 20:26:40 crc kubenswrapper[4781]: I1208 20:26:40.138526 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3331e6e-3384-4cc5-af5a-c24583a6865b" path="/var/lib/kubelet/pods/e3331e6e-3384-4cc5-af5a-c24583a6865b/volumes" Dec 08 20:26:40 crc kubenswrapper[4781]: I1208 20:26:40.730547 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"daf2aa43-ddc7-4618-a3df-665b947b68bd","Type":"ContainerStarted","Data":"6bc4a833bacd84bda7b5298982f2ac657d25d93600d1bf9eebc40d3503ba4f7d"} Dec 08 20:26:40 crc kubenswrapper[4781]: I1208 20:26:40.746252 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.746232441 podStartE2EDuration="2.746232441s" podCreationTimestamp="2025-12-08 20:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:26:40.742376301 +0000 UTC m=+1316.893659688" watchObservedRunningTime="2025-12-08 20:26:40.746232441 +0000 UTC m=+1316.897515818" Dec 08 20:26:42 crc kubenswrapper[4781]: I1208 20:26:42.101131 
4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 20:26:42 crc kubenswrapper[4781]: I1208 20:26:42.101490 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 08 20:26:44 crc kubenswrapper[4781]: I1208 20:26:44.040456 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 20:26:44 crc kubenswrapper[4781]: I1208 20:26:44.041059 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 08 20:26:44 crc kubenswrapper[4781]: I1208 20:26:44.111143 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 08 20:26:45 crc kubenswrapper[4781]: I1208 20:26:45.087115 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cba17aa8-5de4-4747-bc12-50b1d1b66490" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 20:26:45 crc kubenswrapper[4781]: I1208 20:26:45.087193 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cba17aa8-5de4-4747-bc12-50b1d1b66490" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 20:26:47 crc kubenswrapper[4781]: I1208 20:26:47.101838 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 08 20:26:47 crc kubenswrapper[4781]: I1208 20:26:47.102205 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 08 20:26:48 crc kubenswrapper[4781]: I1208 20:26:48.117099 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="2348d949-d7f2-44a5-8e47-58b358d060c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 20:26:48 crc kubenswrapper[4781]: I1208 20:26:48.117129 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2348d949-d7f2-44a5-8e47-58b358d060c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 08 20:26:49 crc kubenswrapper[4781]: I1208 20:26:49.111781 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 08 20:26:49 crc kubenswrapper[4781]: I1208 20:26:49.146652 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 08 20:26:49 crc kubenswrapper[4781]: I1208 20:26:49.872038 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 08 20:26:51 crc kubenswrapper[4781]: I1208 20:26:51.130711 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 08 20:26:54 crc kubenswrapper[4781]: I1208 20:26:54.048223 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 08 20:26:54 crc kubenswrapper[4781]: I1208 20:26:54.049282 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 08 20:26:54 crc kubenswrapper[4781]: I1208 20:26:54.050473 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 08 20:26:54 crc kubenswrapper[4781]: I1208 20:26:54.056870 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 08 20:26:54 crc kubenswrapper[4781]: I1208 20:26:54.882457 4781 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 08 20:26:54 crc kubenswrapper[4781]: I1208 20:26:54.891758 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 08 20:26:57 crc kubenswrapper[4781]: I1208 20:26:57.108220 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 08 20:26:57 crc kubenswrapper[4781]: I1208 20:26:57.114197 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 08 20:26:57 crc kubenswrapper[4781]: I1208 20:26:57.114807 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 08 20:26:57 crc kubenswrapper[4781]: I1208 20:26:57.915104 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 08 20:27:05 crc kubenswrapper[4781]: I1208 20:27:05.993940 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:27:06 crc kubenswrapper[4781]: I1208 20:27:06.898365 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:27:10 crc kubenswrapper[4781]: I1208 20:27:10.083470 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" containerName="rabbitmq" containerID="cri-o://13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7" gracePeriod=604796 Dec 08 20:27:10 crc kubenswrapper[4781]: I1208 20:27:10.973795 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerName="rabbitmq" containerID="cri-o://0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19" gracePeriod=604796 Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 
20:27:16.079612 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.648383 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.787254 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.952834 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9192ae66-92ec-4618-aecd-3ec306da8525-pod-info\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.952937 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-confd\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.952966 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-tls\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.953103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/9192ae66-92ec-4618-aecd-3ec306da8525-erlang-cookie-secret\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.953132 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-plugins-conf\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.953193 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgpxx\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-kube-api-access-dgpxx\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.953249 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-erlang-cookie\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.953295 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-plugins\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.953320 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 
20:27:16.953344 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-config-data\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.953374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-server-conf\") pod \"9192ae66-92ec-4618-aecd-3ec306da8525\" (UID: \"9192ae66-92ec-4618-aecd-3ec306da8525\") " Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.953932 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.954609 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.954718 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.962089 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.962314 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.972324 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9192ae66-92ec-4618-aecd-3ec306da8525-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.973606 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9192ae66-92ec-4618-aecd-3ec306da8525-pod-info" (OuterVolumeSpecName: "pod-info") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 08 20:27:16 crc kubenswrapper[4781]: I1208 20:27:16.975037 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-kube-api-access-dgpxx" (OuterVolumeSpecName: "kube-api-access-dgpxx") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "kube-api-access-dgpxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.041244 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-server-conf" (OuterVolumeSpecName: "server-conf") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.044029 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-config-data" (OuterVolumeSpecName: "config-data") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055624 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055727 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055741 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055752 4781 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-server-conf\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055762 4781 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9192ae66-92ec-4618-aecd-3ec306da8525-pod-info\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055777 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055789 4781 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9192ae66-92ec-4618-aecd-3ec306da8525-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055801 4781 reconciler_common.go:293] 
"Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9192ae66-92ec-4618-aecd-3ec306da8525-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055812 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgpxx\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-kube-api-access-dgpxx\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.055825 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.077650 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.095852 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9192ae66-92ec-4618-aecd-3ec306da8525" (UID: "9192ae66-92ec-4618-aecd-3ec306da8525"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.139780 4781 generic.go:334] "Generic (PLEG): container finished" podID="9192ae66-92ec-4618-aecd-3ec306da8525" containerID="13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7" exitCode=0 Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.139834 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9192ae66-92ec-4618-aecd-3ec306da8525","Type":"ContainerDied","Data":"13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7"} Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.139867 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"9192ae66-92ec-4618-aecd-3ec306da8525","Type":"ContainerDied","Data":"82841c65ce8bbb78824ed1dfc40a0711e453dad420f4369c2215c046535bebaa"} Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.139890 4781 scope.go:117] "RemoveContainer" containerID="13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.140093 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.158229 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9192ae66-92ec-4618-aecd-3ec306da8525-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.158256 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.177689 4781 scope.go:117] "RemoveContainer" containerID="53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.184438 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.196516 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.211371 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:27:17 crc kubenswrapper[4781]: E1208 20:27:17.211732 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" containerName="rabbitmq" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.211748 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" containerName="rabbitmq" Dec 08 20:27:17 crc kubenswrapper[4781]: E1208 20:27:17.211790 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" containerName="setup-container" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.211797 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" containerName="setup-container" Dec 08 
20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.211972 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" containerName="rabbitmq" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.212947 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.226429 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.226476 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.226613 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.226816 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.227067 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mftgd" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.227418 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.227599 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.256005 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.267356 4781 scope.go:117] "RemoveContainer" containerID="13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7" Dec 08 20:27:17 crc kubenswrapper[4781]: E1208 20:27:17.270367 4781 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7\": container with ID starting with 13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7 not found: ID does not exist" containerID="13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.270416 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7"} err="failed to get container status \"13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7\": rpc error: code = NotFound desc = could not find container \"13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7\": container with ID starting with 13e7d24de8bed466a6fda859022625c9097117b1843aeba955fd03bd8c31fbb7 not found: ID does not exist" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.270442 4781 scope.go:117] "RemoveContainer" containerID="53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558" Dec 08 20:27:17 crc kubenswrapper[4781]: E1208 20:27:17.280163 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558\": container with ID starting with 53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558 not found: ID does not exist" containerID="53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.280245 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558"} err="failed to get container status \"53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558\": rpc error: code = NotFound desc = could not find container 
\"53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558\": container with ID starting with 53cd4c4a695cb9ad9bb144e86dd7dccccf8ed41ca0427fc0f9f19b84cf5df558 not found: ID does not exist" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375308 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375404 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-config-data\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375421 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375530 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/20fc56e1-a1f6-4495-834a-41bfebf14aef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375569 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375614 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2c28\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-kube-api-access-h2c28\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375655 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375689 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375756 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.375778 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20fc56e1-a1f6-4495-834a-41bfebf14aef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477604 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477650 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477677 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-config-data\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " 
pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477773 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20fc56e1-a1f6-4495-834a-41bfebf14aef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477827 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477863 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2c28\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-kube-api-access-h2c28\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477890 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477927 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477958 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.477972 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20fc56e1-a1f6-4495-834a-41bfebf14aef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.478798 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.479439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.478802 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-config-data\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.479903 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.480042 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20fc56e1-a1f6-4495-834a-41bfebf14aef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.480498 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.488349 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20fc56e1-a1f6-4495-834a-41bfebf14aef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.489491 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.494025 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.501089 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h2c28\" (UniqueName: \"kubernetes.io/projected/20fc56e1-a1f6-4495-834a-41bfebf14aef-kube-api-access-h2c28\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.505548 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20fc56e1-a1f6-4495-834a-41bfebf14aef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.516238 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"20fc56e1-a1f6-4495-834a-41bfebf14aef\") " pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.554733 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.697657 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.888712 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-plugins\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889046 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-confd\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889083 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-config-data\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-erlang-cookie\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889165 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-plugins-conf\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/202f9454-de0b-4a09-abb6-dacbea9b5fa4-erlang-cookie-secret\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889406 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/202f9454-de0b-4a09-abb6-dacbea9b5fa4-pod-info\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889427 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-server-conf\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889453 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-tls\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889497 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxnll\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-kube-api-access-hxnll\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.889547 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\" (UID: \"202f9454-de0b-4a09-abb6-dacbea9b5fa4\") " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 
20:27:17.889843 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.890142 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.890627 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.890804 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.896770 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.897711 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/202f9454-de0b-4a09-abb6-dacbea9b5fa4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.898498 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/202f9454-de0b-4a09-abb6-dacbea9b5fa4-pod-info" (OuterVolumeSpecName: "pod-info") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.918883 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.919055 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-kube-api-access-hxnll" (OuterVolumeSpecName: "kube-api-access-hxnll") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "kube-api-access-hxnll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.925758 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-config-data" (OuterVolumeSpecName: "config-data") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.958876 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-server-conf" (OuterVolumeSpecName: "server-conf") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992363 4781 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/202f9454-de0b-4a09-abb6-dacbea9b5fa4-pod-info\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992419 4781 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-server-conf\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992432 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992444 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxnll\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-kube-api-access-hxnll\") on node \"crc\" DevicePath \"\"" Dec 08 
20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992473 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992484 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992495 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992506 4781 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/202f9454-de0b-4a09-abb6-dacbea9b5fa4-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:17 crc kubenswrapper[4781]: I1208 20:27:17.992518 4781 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/202f9454-de0b-4a09-abb6-dacbea9b5fa4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.001462 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "202f9454-de0b-4a09-abb6-dacbea9b5fa4" (UID: "202f9454-de0b-4a09-abb6-dacbea9b5fa4"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.020430 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.059132 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.094133 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.094162 4781 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/202f9454-de0b-4a09-abb6-dacbea9b5fa4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.136990 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9192ae66-92ec-4618-aecd-3ec306da8525" path="/var/lib/kubelet/pods/9192ae66-92ec-4618-aecd-3ec306da8525/volumes" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.153817 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20fc56e1-a1f6-4495-834a-41bfebf14aef","Type":"ContainerStarted","Data":"bcbb6d5f066affca2c31d19021fbaee6e1b3ecd2cb8e0bbc4c6eed74b490e8dd"} Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.156194 4781 generic.go:334] "Generic (PLEG): container finished" podID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerID="0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19" exitCode=0 Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.156254 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"202f9454-de0b-4a09-abb6-dacbea9b5fa4","Type":"ContainerDied","Data":"0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19"} Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.156279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"202f9454-de0b-4a09-abb6-dacbea9b5fa4","Type":"ContainerDied","Data":"7422fda4dbfbc56963c51e0c3feb53051d5a13bd68dc6ff62ddc57aa36101ded"} Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.156295 4781 scope.go:117] "RemoveContainer" containerID="0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.156376 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.188251 4781 scope.go:117] "RemoveContainer" containerID="3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.199063 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.208746 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.222487 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:27:18 crc kubenswrapper[4781]: E1208 20:27:18.223017 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerName="rabbitmq" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.223034 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerName="rabbitmq" Dec 08 20:27:18 crc kubenswrapper[4781]: E1208 20:27:18.223055 4781 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerName="setup-container" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.223062 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerName="setup-container" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.223295 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" containerName="rabbitmq" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.229228 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.235662 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.235728 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.235662 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.235996 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.236022 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mkjkl" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.236211 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.236867 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.253874 4781 scope.go:117] "RemoveContainer" 
containerID="0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19" Dec 08 20:27:18 crc kubenswrapper[4781]: E1208 20:27:18.254409 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19\": container with ID starting with 0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19 not found: ID does not exist" containerID="0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.254444 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19"} err="failed to get container status \"0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19\": rpc error: code = NotFound desc = could not find container \"0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19\": container with ID starting with 0ac5f9ab1ce79b914a89ce8ace956b426166c3bc9dcb385340f4505993f3cf19 not found: ID does not exist" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.254466 4781 scope.go:117] "RemoveContainer" containerID="3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9" Dec 08 20:27:18 crc kubenswrapper[4781]: E1208 20:27:18.255670 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9\": container with ID starting with 3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9 not found: ID does not exist" containerID="3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.255730 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9"} err="failed to get container status \"3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9\": rpc error: code = NotFound desc = could not find container \"3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9\": container with ID starting with 3f89ebe6b8f5c4b9493038b512e65d98b842d3b16dc59eb207539df3c792e9d9 not found: ID does not exist" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.258415 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.399777 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400075 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400096 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400124 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400154 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400183 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400278 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400348 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400451 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400532 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.400576 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gdhr\" (UniqueName: \"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-kube-api-access-5gdhr\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.502753 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.502831 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.502866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.502902 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.502977 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.503216 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.503248 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gdhr\" (UniqueName: \"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-kube-api-access-5gdhr\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.503307 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.503352 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.503374 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.503408 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.503654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.504493 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc 
kubenswrapper[4781]: I1208 20:27:18.504654 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.504885 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.505023 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.505356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.508853 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.509619 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.513553 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.518615 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.522164 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gdhr\" (UniqueName: \"kubernetes.io/projected/984eb37b-647a-4e37-b4bc-6e7a3becb3ce-kube-api-access-5gdhr\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.550525 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"984eb37b-647a-4e37-b4bc-6e7a3becb3ce\") " pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.570655 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:18 crc kubenswrapper[4781]: I1208 20:27:18.966095 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.176338 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"984eb37b-647a-4e37-b4bc-6e7a3becb3ce","Type":"ContainerStarted","Data":"5e6be66577cc338634e2a460d74ac6f380f008fb76f2e5a0695435f7c41905d7"} Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.670955 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d96bc86b9-jlcjh"] Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.672850 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.675308 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.695597 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d96bc86b9-jlcjh"] Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.836113 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws6z9\" (UniqueName: \"kubernetes.io/projected/cf61dd97-783a-4929-9b4f-bed3715f731b-kube-api-access-ws6z9\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.836161 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-svc\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " 
pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.836190 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-swift-storage-0\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.836226 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-config\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.836300 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-nb\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.836385 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-openstack-edpm-ipam\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.836423 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-sb\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: 
\"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.938497 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-sb\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.938564 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws6z9\" (UniqueName: \"kubernetes.io/projected/cf61dd97-783a-4929-9b4f-bed3715f731b-kube-api-access-ws6z9\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.938588 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-svc\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.938615 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-swift-storage-0\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.938646 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-config\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " 
pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.938690 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-nb\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.938744 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-openstack-edpm-ipam\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.939565 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-sb\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.939832 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-nb\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.939834 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-swift-storage-0\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc 
kubenswrapper[4781]: I1208 20:27:19.940141 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-svc\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.940483 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-config\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.940945 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-openstack-edpm-ipam\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.955798 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws6z9\" (UniqueName: \"kubernetes.io/projected/cf61dd97-783a-4929-9b4f-bed3715f731b-kube-api-access-ws6z9\") pod \"dnsmasq-dns-d96bc86b9-jlcjh\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:19 crc kubenswrapper[4781]: I1208 20:27:19.992069 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:20 crc kubenswrapper[4781]: I1208 20:27:20.138851 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="202f9454-de0b-4a09-abb6-dacbea9b5fa4" path="/var/lib/kubelet/pods/202f9454-de0b-4a09-abb6-dacbea9b5fa4/volumes" Dec 08 20:27:20 crc kubenswrapper[4781]: I1208 20:27:20.187499 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20fc56e1-a1f6-4495-834a-41bfebf14aef","Type":"ContainerStarted","Data":"c9711184225b24cf7fd9d50bcf4ebfeab17c5e413fc6035df3fc12acf868ecee"} Dec 08 20:27:20 crc kubenswrapper[4781]: W1208 20:27:20.472027 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf61dd97_783a_4929_9b4f_bed3715f731b.slice/crio-3e81dfdce7b19ab7028b52edcd659eb1b499610a8eed11e0f8f8c4536b838796 WatchSource:0}: Error finding container 3e81dfdce7b19ab7028b52edcd659eb1b499610a8eed11e0f8f8c4536b838796: Status 404 returned error can't find the container with id 3e81dfdce7b19ab7028b52edcd659eb1b499610a8eed11e0f8f8c4536b838796 Dec 08 20:27:20 crc kubenswrapper[4781]: I1208 20:27:20.476190 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d96bc86b9-jlcjh"] Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.094445 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6bfvk"] Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.096868 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.109280 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6bfvk"] Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.197356 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"984eb37b-647a-4e37-b4bc-6e7a3becb3ce","Type":"ContainerStarted","Data":"df1b58716c2b1a4062744d0ca3ee7a923d4f699e52fbd5d9fcdfabf116ea966b"} Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.200110 4781 generic.go:334] "Generic (PLEG): container finished" podID="cf61dd97-783a-4929-9b4f-bed3715f731b" containerID="c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263" exitCode=0 Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.200199 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" event={"ID":"cf61dd97-783a-4929-9b4f-bed3715f731b","Type":"ContainerDied","Data":"c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263"} Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.200355 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" event={"ID":"cf61dd97-783a-4929-9b4f-bed3715f731b","Type":"ContainerStarted","Data":"3e81dfdce7b19ab7028b52edcd659eb1b499610a8eed11e0f8f8c4536b838796"} Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.264075 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-catalog-content\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.264258 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-utilities\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.264277 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjpn\" (UniqueName: \"kubernetes.io/projected/7e75beda-bddd-4fe0-b297-29dca9bbd695-kube-api-access-9sjpn\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.366512 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-utilities\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.366570 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjpn\" (UniqueName: \"kubernetes.io/projected/7e75beda-bddd-4fe0-b297-29dca9bbd695-kube-api-access-9sjpn\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.366667 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-catalog-content\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.367020 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-utilities\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.367133 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-catalog-content\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.384831 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjpn\" (UniqueName: \"kubernetes.io/projected/7e75beda-bddd-4fe0-b297-29dca9bbd695-kube-api-access-9sjpn\") pod \"redhat-operators-6bfvk\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.419361 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:21 crc kubenswrapper[4781]: W1208 20:27:21.909206 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e75beda_bddd_4fe0_b297_29dca9bbd695.slice/crio-e77f570cff2141829c16504a2d30da2fedfe58c2579ba604a7b5a7b6a7647397 WatchSource:0}: Error finding container e77f570cff2141829c16504a2d30da2fedfe58c2579ba604a7b5a7b6a7647397: Status 404 returned error can't find the container with id e77f570cff2141829c16504a2d30da2fedfe58c2579ba604a7b5a7b6a7647397 Dec 08 20:27:21 crc kubenswrapper[4781]: I1208 20:27:21.909531 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6bfvk"] Dec 08 20:27:22 crc kubenswrapper[4781]: I1208 20:27:22.231898 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" event={"ID":"cf61dd97-783a-4929-9b4f-bed3715f731b","Type":"ContainerStarted","Data":"3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941"} Dec 08 20:27:22 crc kubenswrapper[4781]: I1208 20:27:22.232328 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:22 crc kubenswrapper[4781]: I1208 20:27:22.242118 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bfvk" event={"ID":"7e75beda-bddd-4fe0-b297-29dca9bbd695","Type":"ContainerStarted","Data":"12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340"} Dec 08 20:27:22 crc kubenswrapper[4781]: I1208 20:27:22.242773 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bfvk" event={"ID":"7e75beda-bddd-4fe0-b297-29dca9bbd695","Type":"ContainerStarted","Data":"e77f570cff2141829c16504a2d30da2fedfe58c2579ba604a7b5a7b6a7647397"} Dec 08 20:27:22 crc kubenswrapper[4781]: I1208 20:27:22.258013 4781 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" podStartSLOduration=3.257999454 podStartE2EDuration="3.257999454s" podCreationTimestamp="2025-12-08 20:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:27:22.257331445 +0000 UTC m=+1358.408614852" watchObservedRunningTime="2025-12-08 20:27:22.257999454 +0000 UTC m=+1358.409282831" Dec 08 20:27:23 crc kubenswrapper[4781]: I1208 20:27:23.254906 4781 generic.go:334] "Generic (PLEG): container finished" podID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerID="12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340" exitCode=0 Dec 08 20:27:23 crc kubenswrapper[4781]: I1208 20:27:23.255271 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bfvk" event={"ID":"7e75beda-bddd-4fe0-b297-29dca9bbd695","Type":"ContainerDied","Data":"12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340"} Dec 08 20:27:23 crc kubenswrapper[4781]: I1208 20:27:23.255449 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bfvk" event={"ID":"7e75beda-bddd-4fe0-b297-29dca9bbd695","Type":"ContainerStarted","Data":"24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3"} Dec 08 20:27:25 crc kubenswrapper[4781]: I1208 20:27:25.275380 4781 generic.go:334] "Generic (PLEG): container finished" podID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerID="24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3" exitCode=0 Dec 08 20:27:25 crc kubenswrapper[4781]: I1208 20:27:25.275586 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bfvk" event={"ID":"7e75beda-bddd-4fe0-b297-29dca9bbd695","Type":"ContainerDied","Data":"24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3"} Dec 08 20:27:26 crc 
kubenswrapper[4781]: I1208 20:27:26.285199 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bfvk" event={"ID":"7e75beda-bddd-4fe0-b297-29dca9bbd695","Type":"ContainerStarted","Data":"aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421"} Dec 08 20:27:26 crc kubenswrapper[4781]: I1208 20:27:26.304934 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6bfvk" podStartSLOduration=1.738150686 podStartE2EDuration="5.304898159s" podCreationTimestamp="2025-12-08 20:27:21 +0000 UTC" firstStartedPulling="2025-12-08 20:27:22.2446364 +0000 UTC m=+1358.395919777" lastFinishedPulling="2025-12-08 20:27:25.811383873 +0000 UTC m=+1361.962667250" observedRunningTime="2025-12-08 20:27:26.300243226 +0000 UTC m=+1362.451526613" watchObservedRunningTime="2025-12-08 20:27:26.304898159 +0000 UTC m=+1362.456181536" Dec 08 20:27:29 crc kubenswrapper[4781]: I1208 20:27:29.994180 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.060350 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-4cltn"] Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.060629 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" podUID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" containerName="dnsmasq-dns" containerID="cri-o://a1cb65ffc720e3f84360697dd46a416438be98fbc4daa7532ab08873844db00d" gracePeriod=10 Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.231000 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6574f55bb5-rdnzp"] Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.257149 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.281039 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6574f55bb5-rdnzp"] Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.350411 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-dns-swift-storage-0\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.350846 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rpkm\" (UniqueName: \"kubernetes.io/projected/11821178-db83-4950-8900-5f6fcc68f184-kube-api-access-6rpkm\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.351017 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-config\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.351193 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-dns-svc\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.351325 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-ovsdbserver-sb\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.351686 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-ovsdbserver-nb\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.351844 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-openstack-edpm-ipam\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.453514 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rpkm\" (UniqueName: \"kubernetes.io/projected/11821178-db83-4950-8900-5f6fcc68f184-kube-api-access-6rpkm\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.453866 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-config\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.454072 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-dns-svc\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.454187 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-ovsdbserver-sb\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.454330 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-ovsdbserver-nb\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.454495 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-openstack-edpm-ipam\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.454659 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-dns-swift-storage-0\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.455103 4781 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-config\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.455112 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-dns-svc\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.455570 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-dns-swift-storage-0\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.455610 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-ovsdbserver-nb\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.455659 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-openstack-edpm-ipam\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.456485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/11821178-db83-4950-8900-5f6fcc68f184-ovsdbserver-sb\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.477643 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rpkm\" (UniqueName: \"kubernetes.io/projected/11821178-db83-4950-8900-5f6fcc68f184-kube-api-access-6rpkm\") pod \"dnsmasq-dns-6574f55bb5-rdnzp\" (UID: \"11821178-db83-4950-8900-5f6fcc68f184\") " pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.589821 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:30 crc kubenswrapper[4781]: I1208 20:27:30.934324 4781 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" podUID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.198:5353: connect: connection refused" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.023610 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6574f55bb5-rdnzp"] Dec 08 20:27:31 crc kubenswrapper[4781]: W1208 20:27:31.025381 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11821178_db83_4950_8900_5f6fcc68f184.slice/crio-7abefafb3cb62d7007f43ec87cc8f8cb1d8c6845f3a40047d31beb4898931fd8 WatchSource:0}: Error finding container 7abefafb3cb62d7007f43ec87cc8f8cb1d8c6845f3a40047d31beb4898931fd8: Status 404 returned error can't find the container with id 7abefafb3cb62d7007f43ec87cc8f8cb1d8c6845f3a40047d31beb4898931fd8 Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.334859 4781 generic.go:334] "Generic (PLEG): container finished" podID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" 
containerID="a1cb65ffc720e3f84360697dd46a416438be98fbc4daa7532ab08873844db00d" exitCode=0 Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.335202 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" event={"ID":"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93","Type":"ContainerDied","Data":"a1cb65ffc720e3f84360697dd46a416438be98fbc4daa7532ab08873844db00d"} Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.338203 4781 generic.go:334] "Generic (PLEG): container finished" podID="11821178-db83-4950-8900-5f6fcc68f184" containerID="50ff7bb00f4792d67874f98a919349806429a20668026b7a83186e52ad624289" exitCode=0 Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.338230 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" event={"ID":"11821178-db83-4950-8900-5f6fcc68f184","Type":"ContainerDied","Data":"50ff7bb00f4792d67874f98a919349806429a20668026b7a83186e52ad624289"} Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.338249 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" event={"ID":"11821178-db83-4950-8900-5f6fcc68f184","Type":"ContainerStarted","Data":"7abefafb3cb62d7007f43ec87cc8f8cb1d8c6845f3a40047d31beb4898931fd8"} Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.420910 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.420979 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.561491 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.730017 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.793212 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-svc\") pod \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.793314 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-sb\") pod \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.793388 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-nb\") pod \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.793431 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-swift-storage-0\") pod \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.793453 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-config\") pod \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.793548 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6khxt\" 
(UniqueName: \"kubernetes.io/projected/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-kube-api-access-6khxt\") pod \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\" (UID: \"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93\") " Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.798954 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-kube-api-access-6khxt" (OuterVolumeSpecName: "kube-api-access-6khxt") pod "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" (UID: "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93"). InnerVolumeSpecName "kube-api-access-6khxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.882336 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" (UID: "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.892213 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" (UID: "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.896310 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.896340 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.896351 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6khxt\" (UniqueName: \"kubernetes.io/projected/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-kube-api-access-6khxt\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.897135 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-config" (OuterVolumeSpecName: "config") pod "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" (UID: "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.901313 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" (UID: "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.904844 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" (UID: "1fa8370f-5b9e-400f-aa71-1c50bd1a7d93"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.998664 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.999010 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:31 crc kubenswrapper[4781]: I1208 20:27:31.999022 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.349531 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" event={"ID":"1fa8370f-5b9e-400f-aa71-1c50bd1a7d93","Type":"ContainerDied","Data":"14a9dfd54e245b342d9f7f23a57db6510ce8990398dbaba8655f4177cbbace99"} Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.350666 4781 scope.go:117] "RemoveContainer" containerID="a1cb65ffc720e3f84360697dd46a416438be98fbc4daa7532ab08873844db00d" Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.349574 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-4cltn" Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.351437 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" event={"ID":"11821178-db83-4950-8900-5f6fcc68f184","Type":"ContainerStarted","Data":"8ba4763c611e444e3ccdf57778ae43982d4278a7933285513660a8b27ccb4db8"} Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.352466 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.369841 4781 scope.go:117] "RemoveContainer" containerID="f910a1a66581005ab1c91fccf2bada196fc44eeca6df8314e914f7c56dcabce0" Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.386226 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" podStartSLOduration=2.386204872 podStartE2EDuration="2.386204872s" podCreationTimestamp="2025-12-08 20:27:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:27:32.378450089 +0000 UTC m=+1368.529733456" watchObservedRunningTime="2025-12-08 20:27:32.386204872 +0000 UTC m=+1368.537488259" Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.398929 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-4cltn"] Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.407691 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-4cltn"] Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.412299 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:32 crc kubenswrapper[4781]: I1208 20:27:32.468523 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6bfvk"] 
Dec 08 20:27:34 crc kubenswrapper[4781]: I1208 20:27:34.136990 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" path="/var/lib/kubelet/pods/1fa8370f-5b9e-400f-aa71-1c50bd1a7d93/volumes" Dec 08 20:27:34 crc kubenswrapper[4781]: I1208 20:27:34.368781 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6bfvk" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerName="registry-server" containerID="cri-o://aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421" gracePeriod=2 Dec 08 20:27:34 crc kubenswrapper[4781]: I1208 20:27:34.869964 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:34 crc kubenswrapper[4781]: I1208 20:27:34.956117 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sjpn\" (UniqueName: \"kubernetes.io/projected/7e75beda-bddd-4fe0-b297-29dca9bbd695-kube-api-access-9sjpn\") pod \"7e75beda-bddd-4fe0-b297-29dca9bbd695\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " Dec 08 20:27:34 crc kubenswrapper[4781]: I1208 20:27:34.956356 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-utilities\") pod \"7e75beda-bddd-4fe0-b297-29dca9bbd695\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " Dec 08 20:27:34 crc kubenswrapper[4781]: I1208 20:27:34.956394 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-catalog-content\") pod \"7e75beda-bddd-4fe0-b297-29dca9bbd695\" (UID: \"7e75beda-bddd-4fe0-b297-29dca9bbd695\") " Dec 08 20:27:34 crc kubenswrapper[4781]: I1208 20:27:34.957300 4781 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-utilities" (OuterVolumeSpecName: "utilities") pod "7e75beda-bddd-4fe0-b297-29dca9bbd695" (UID: "7e75beda-bddd-4fe0-b297-29dca9bbd695"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:27:34 crc kubenswrapper[4781]: I1208 20:27:34.969619 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e75beda-bddd-4fe0-b297-29dca9bbd695-kube-api-access-9sjpn" (OuterVolumeSpecName: "kube-api-access-9sjpn") pod "7e75beda-bddd-4fe0-b297-29dca9bbd695" (UID: "7e75beda-bddd-4fe0-b297-29dca9bbd695"). InnerVolumeSpecName "kube-api-access-9sjpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.058838 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.059164 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sjpn\" (UniqueName: \"kubernetes.io/projected/7e75beda-bddd-4fe0-b297-29dca9bbd695-kube-api-access-9sjpn\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.075328 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e75beda-bddd-4fe0-b297-29dca9bbd695" (UID: "7e75beda-bddd-4fe0-b297-29dca9bbd695"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.161700 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e75beda-bddd-4fe0-b297-29dca9bbd695-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.383147 4781 generic.go:334] "Generic (PLEG): container finished" podID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerID="aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421" exitCode=0 Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.383195 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bfvk" event={"ID":"7e75beda-bddd-4fe0-b297-29dca9bbd695","Type":"ContainerDied","Data":"aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421"} Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.383226 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6bfvk" event={"ID":"7e75beda-bddd-4fe0-b297-29dca9bbd695","Type":"ContainerDied","Data":"e77f570cff2141829c16504a2d30da2fedfe58c2579ba604a7b5a7b6a7647397"} Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.383247 4781 scope.go:117] "RemoveContainer" containerID="aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.383430 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6bfvk" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.410959 4781 scope.go:117] "RemoveContainer" containerID="24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.417166 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6bfvk"] Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.425451 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6bfvk"] Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.449902 4781 scope.go:117] "RemoveContainer" containerID="12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.495013 4781 scope.go:117] "RemoveContainer" containerID="aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421" Dec 08 20:27:35 crc kubenswrapper[4781]: E1208 20:27:35.495562 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421\": container with ID starting with aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421 not found: ID does not exist" containerID="aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.495635 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421"} err="failed to get container status \"aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421\": rpc error: code = NotFound desc = could not find container \"aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421\": container with ID starting with aaec2a2d9cd339d6259039375a7d5cfde82fa9f4c2237d61641c67e4a2079421 not found: ID does 
not exist" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.495855 4781 scope.go:117] "RemoveContainer" containerID="24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3" Dec 08 20:27:35 crc kubenswrapper[4781]: E1208 20:27:35.496663 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3\": container with ID starting with 24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3 not found: ID does not exist" containerID="24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.496749 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3"} err="failed to get container status \"24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3\": rpc error: code = NotFound desc = could not find container \"24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3\": container with ID starting with 24afc53c0c52a8e5519db8e74350f7a19c568357969d898db5c5a4379bb3e5f3 not found: ID does not exist" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.496826 4781 scope.go:117] "RemoveContainer" containerID="12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340" Dec 08 20:27:35 crc kubenswrapper[4781]: E1208 20:27:35.497410 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340\": container with ID starting with 12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340 not found: ID does not exist" containerID="12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340" Dec 08 20:27:35 crc kubenswrapper[4781]: I1208 20:27:35.497466 4781 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340"} err="failed to get container status \"12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340\": rpc error: code = NotFound desc = could not find container \"12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340\": container with ID starting with 12af98693d90fd75f525890f17cffcec34a2d5d8e1ae3c314ee437133af28340 not found: ID does not exist" Dec 08 20:27:36 crc kubenswrapper[4781]: I1208 20:27:36.137127 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" path="/var/lib/kubelet/pods/7e75beda-bddd-4fe0-b297-29dca9bbd695/volumes" Dec 08 20:27:40 crc kubenswrapper[4781]: I1208 20:27:40.592100 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6574f55bb5-rdnzp" Dec 08 20:27:40 crc kubenswrapper[4781]: I1208 20:27:40.663423 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d96bc86b9-jlcjh"] Dec 08 20:27:40 crc kubenswrapper[4781]: I1208 20:27:40.663672 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" podUID="cf61dd97-783a-4929-9b4f-bed3715f731b" containerName="dnsmasq-dns" containerID="cri-o://3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941" gracePeriod=10 Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.178793 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.270398 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-swift-storage-0\") pod \"cf61dd97-783a-4929-9b4f-bed3715f731b\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.270564 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws6z9\" (UniqueName: \"kubernetes.io/projected/cf61dd97-783a-4929-9b4f-bed3715f731b-kube-api-access-ws6z9\") pod \"cf61dd97-783a-4929-9b4f-bed3715f731b\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.270590 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-config\") pod \"cf61dd97-783a-4929-9b4f-bed3715f731b\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.270616 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-sb\") pod \"cf61dd97-783a-4929-9b4f-bed3715f731b\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.270638 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-openstack-edpm-ipam\") pod \"cf61dd97-783a-4929-9b4f-bed3715f731b\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.270688 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-svc\") pod \"cf61dd97-783a-4929-9b4f-bed3715f731b\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.270736 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-nb\") pod \"cf61dd97-783a-4929-9b4f-bed3715f731b\" (UID: \"cf61dd97-783a-4929-9b4f-bed3715f731b\") " Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.276850 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf61dd97-783a-4929-9b4f-bed3715f731b-kube-api-access-ws6z9" (OuterVolumeSpecName: "kube-api-access-ws6z9") pod "cf61dd97-783a-4929-9b4f-bed3715f731b" (UID: "cf61dd97-783a-4929-9b4f-bed3715f731b"). InnerVolumeSpecName "kube-api-access-ws6z9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.339888 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-config" (OuterVolumeSpecName: "config") pod "cf61dd97-783a-4929-9b4f-bed3715f731b" (UID: "cf61dd97-783a-4929-9b4f-bed3715f731b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.340051 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cf61dd97-783a-4929-9b4f-bed3715f731b" (UID: "cf61dd97-783a-4929-9b4f-bed3715f731b"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.341480 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf61dd97-783a-4929-9b4f-bed3715f731b" (UID: "cf61dd97-783a-4929-9b4f-bed3715f731b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.348028 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf61dd97-783a-4929-9b4f-bed3715f731b" (UID: "cf61dd97-783a-4929-9b4f-bed3715f731b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.353029 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf61dd97-783a-4929-9b4f-bed3715f731b" (UID: "cf61dd97-783a-4929-9b4f-bed3715f731b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.356535 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "cf61dd97-783a-4929-9b4f-bed3715f731b" (UID: "cf61dd97-783a-4929-9b4f-bed3715f731b"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.373364 4781 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.373404 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws6z9\" (UniqueName: \"kubernetes.io/projected/cf61dd97-783a-4929-9b4f-bed3715f731b-kube-api-access-ws6z9\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.373416 4781 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-config\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.373425 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.373435 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.373443 4781 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.373451 4781 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf61dd97-783a-4929-9b4f-bed3715f731b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.435569 
4781 generic.go:334] "Generic (PLEG): container finished" podID="cf61dd97-783a-4929-9b4f-bed3715f731b" containerID="3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941" exitCode=0 Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.435668 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" event={"ID":"cf61dd97-783a-4929-9b4f-bed3715f731b","Type":"ContainerDied","Data":"3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941"} Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.436073 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" event={"ID":"cf61dd97-783a-4929-9b4f-bed3715f731b","Type":"ContainerDied","Data":"3e81dfdce7b19ab7028b52edcd659eb1b499610a8eed11e0f8f8c4536b838796"} Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.435701 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d96bc86b9-jlcjh" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.436100 4781 scope.go:117] "RemoveContainer" containerID="3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.468891 4781 scope.go:117] "RemoveContainer" containerID="c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.501725 4781 scope.go:117] "RemoveContainer" containerID="3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941" Dec 08 20:27:41 crc kubenswrapper[4781]: E1208 20:27:41.502286 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941\": container with ID starting with 3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941 not found: ID does not exist" 
containerID="3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.502326 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941"} err="failed to get container status \"3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941\": rpc error: code = NotFound desc = could not find container \"3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941\": container with ID starting with 3855e20d17d156d605e5ab3c7e486dbc67f4f4060e516083fdb0a7da21727941 not found: ID does not exist" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.502354 4781 scope.go:117] "RemoveContainer" containerID="c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263" Dec 08 20:27:41 crc kubenswrapper[4781]: E1208 20:27:41.503081 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263\": container with ID starting with c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263 not found: ID does not exist" containerID="c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.503123 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263"} err="failed to get container status \"c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263\": rpc error: code = NotFound desc = could not find container \"c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263\": container with ID starting with c2b2c0796bb8690bb5da90d5f7f57cfa87598c9e801cbf18ba6526b65cfc2263 not found: ID does not exist" Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.503958 4781 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d96bc86b9-jlcjh"] Dec 08 20:27:41 crc kubenswrapper[4781]: I1208 20:27:41.515694 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d96bc86b9-jlcjh"] Dec 08 20:27:42 crc kubenswrapper[4781]: I1208 20:27:42.136876 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf61dd97-783a-4929-9b4f-bed3715f731b" path="/var/lib/kubelet/pods/cf61dd97-783a-4929-9b4f-bed3715f731b/volumes" Dec 08 20:27:52 crc kubenswrapper[4781]: I1208 20:27:52.547223 4781 generic.go:334] "Generic (PLEG): container finished" podID="20fc56e1-a1f6-4495-834a-41bfebf14aef" containerID="c9711184225b24cf7fd9d50bcf4ebfeab17c5e413fc6035df3fc12acf868ecee" exitCode=0 Dec 08 20:27:52 crc kubenswrapper[4781]: I1208 20:27:52.547313 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20fc56e1-a1f6-4495-834a-41bfebf14aef","Type":"ContainerDied","Data":"c9711184225b24cf7fd9d50bcf4ebfeab17c5e413fc6035df3fc12acf868ecee"} Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.557323 4781 generic.go:334] "Generic (PLEG): container finished" podID="984eb37b-647a-4e37-b4bc-6e7a3becb3ce" containerID="df1b58716c2b1a4062744d0ca3ee7a923d4f699e52fbd5d9fcdfabf116ea966b" exitCode=0 Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.557353 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"984eb37b-647a-4e37-b4bc-6e7a3becb3ce","Type":"ContainerDied","Data":"df1b58716c2b1a4062744d0ca3ee7a923d4f699e52fbd5d9fcdfabf116ea966b"} Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.560295 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20fc56e1-a1f6-4495-834a-41bfebf14aef","Type":"ContainerStarted","Data":"bdd81358fcd8731cf9aa446415dac19c9b6b0a83d2e42895812229002f9da60e"} Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.560501 4781 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.625882 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.625860039 podStartE2EDuration="36.625860039s" podCreationTimestamp="2025-12-08 20:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:27:53.612162316 +0000 UTC m=+1389.763445693" watchObservedRunningTime="2025-12-08 20:27:53.625860039 +0000 UTC m=+1389.777143426" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.863531 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw"] Dec 08 20:27:53 crc kubenswrapper[4781]: E1208 20:27:53.864283 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" containerName="init" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864307 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" containerName="init" Dec 08 20:27:53 crc kubenswrapper[4781]: E1208 20:27:53.864335 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf61dd97-783a-4929-9b4f-bed3715f731b" containerName="init" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864344 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf61dd97-783a-4929-9b4f-bed3715f731b" containerName="init" Dec 08 20:27:53 crc kubenswrapper[4781]: E1208 20:27:53.864355 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf61dd97-783a-4929-9b4f-bed3715f731b" containerName="dnsmasq-dns" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864363 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf61dd97-783a-4929-9b4f-bed3715f731b" containerName="dnsmasq-dns" Dec 08 20:27:53 crc 
kubenswrapper[4781]: E1208 20:27:53.864377 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerName="extract-content" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864384 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerName="extract-content" Dec 08 20:27:53 crc kubenswrapper[4781]: E1208 20:27:53.864407 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerName="extract-utilities" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864417 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerName="extract-utilities" Dec 08 20:27:53 crc kubenswrapper[4781]: E1208 20:27:53.864435 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerName="registry-server" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864442 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerName="registry-server" Dec 08 20:27:53 crc kubenswrapper[4781]: E1208 20:27:53.864460 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" containerName="dnsmasq-dns" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864468 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" containerName="dnsmasq-dns" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864693 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf61dd97-783a-4929-9b4f-bed3715f731b" containerName="dnsmasq-dns" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.864714 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fa8370f-5b9e-400f-aa71-1c50bd1a7d93" containerName="dnsmasq-dns" Dec 08 20:27:53 crc 
kubenswrapper[4781]: I1208 20:27:53.864728 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e75beda-bddd-4fe0-b297-29dca9bbd695" containerName="registry-server" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.865506 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.867854 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.868193 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.868279 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.868374 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.873234 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw"] Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.932538 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.932663 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.932749 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:53 crc kubenswrapper[4781]: I1208 20:27:53.932853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxvl6\" (UniqueName: \"kubernetes.io/projected/793b9e15-75d8-49b1-8261-cc624d33aaea-kube-api-access-hxvl6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.034294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxvl6\" (UniqueName: \"kubernetes.io/projected/793b9e15-75d8-49b1-8261-cc624d33aaea-kube-api-access-hxvl6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.034413 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.034458 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.034498 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.037681 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.037714 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.041352 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.050035 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxvl6\" (UniqueName: \"kubernetes.io/projected/793b9e15-75d8-49b1-8261-cc624d33aaea-kube-api-access-hxvl6\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.182215 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.573763 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"984eb37b-647a-4e37-b4bc-6e7a3becb3ce","Type":"ContainerStarted","Data":"db8b1ba1fbae4323598a8edee4404d4bf3358d9e0315b1ece978df5d99052e39"} Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.574237 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:27:54 crc kubenswrapper[4781]: I1208 20:27:54.601849 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.601833374 podStartE2EDuration="36.601833374s" podCreationTimestamp="2025-12-08 20:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:27:54.597739626 +0000 UTC m=+1390.749023003" watchObservedRunningTime="2025-12-08 20:27:54.601833374 +0000 UTC m=+1390.753116741" Dec 08 20:27:55 crc kubenswrapper[4781]: I1208 
20:27:55.252310 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw"] Dec 08 20:27:55 crc kubenswrapper[4781]: I1208 20:27:55.585297 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" event={"ID":"793b9e15-75d8-49b1-8261-cc624d33aaea","Type":"ContainerStarted","Data":"e817c7b9f0d1abce641521c07f1c8717f4bcd7096d3dc092759550bf5b59d4ac"} Dec 08 20:28:05 crc kubenswrapper[4781]: I1208 20:28:05.709858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" event={"ID":"793b9e15-75d8-49b1-8261-cc624d33aaea","Type":"ContainerStarted","Data":"d33c6dda22260985efaa5032b8dbbfb1d0de0e5c607409d9a74146211e051570"} Dec 08 20:28:05 crc kubenswrapper[4781]: I1208 20:28:05.728114 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" podStartSLOduration=3.283581142 podStartE2EDuration="12.72809541s" podCreationTimestamp="2025-12-08 20:27:53 +0000 UTC" firstStartedPulling="2025-12-08 20:27:55.266842396 +0000 UTC m=+1391.418125773" lastFinishedPulling="2025-12-08 20:28:04.711356664 +0000 UTC m=+1400.862640041" observedRunningTime="2025-12-08 20:28:05.724853657 +0000 UTC m=+1401.876137034" watchObservedRunningTime="2025-12-08 20:28:05.72809541 +0000 UTC m=+1401.879378787" Dec 08 20:28:07 crc kubenswrapper[4781]: I1208 20:28:07.559145 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 08 20:28:08 crc kubenswrapper[4781]: I1208 20:28:08.574105 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 08 20:28:18 crc kubenswrapper[4781]: I1208 20:28:18.284140 4781 generic.go:334] "Generic (PLEG): container finished" podID="793b9e15-75d8-49b1-8261-cc624d33aaea" 
containerID="d33c6dda22260985efaa5032b8dbbfb1d0de0e5c607409d9a74146211e051570" exitCode=0 Dec 08 20:28:18 crc kubenswrapper[4781]: I1208 20:28:18.284220 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" event={"ID":"793b9e15-75d8-49b1-8261-cc624d33aaea","Type":"ContainerDied","Data":"d33c6dda22260985efaa5032b8dbbfb1d0de0e5c607409d9a74146211e051570"} Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.679930 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.866611 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxvl6\" (UniqueName: \"kubernetes.io/projected/793b9e15-75d8-49b1-8261-cc624d33aaea-kube-api-access-hxvl6\") pod \"793b9e15-75d8-49b1-8261-cc624d33aaea\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.867171 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-ssh-key\") pod \"793b9e15-75d8-49b1-8261-cc624d33aaea\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.867223 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-inventory\") pod \"793b9e15-75d8-49b1-8261-cc624d33aaea\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.867297 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-repo-setup-combined-ca-bundle\") pod 
\"793b9e15-75d8-49b1-8261-cc624d33aaea\" (UID: \"793b9e15-75d8-49b1-8261-cc624d33aaea\") " Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.877356 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/793b9e15-75d8-49b1-8261-cc624d33aaea-kube-api-access-hxvl6" (OuterVolumeSpecName: "kube-api-access-hxvl6") pod "793b9e15-75d8-49b1-8261-cc624d33aaea" (UID: "793b9e15-75d8-49b1-8261-cc624d33aaea"). InnerVolumeSpecName "kube-api-access-hxvl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.877748 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "793b9e15-75d8-49b1-8261-cc624d33aaea" (UID: "793b9e15-75d8-49b1-8261-cc624d33aaea"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.905064 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-inventory" (OuterVolumeSpecName: "inventory") pod "793b9e15-75d8-49b1-8261-cc624d33aaea" (UID: "793b9e15-75d8-49b1-8261-cc624d33aaea"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.915909 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "793b9e15-75d8-49b1-8261-cc624d33aaea" (UID: "793b9e15-75d8-49b1-8261-cc624d33aaea"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.969573 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.969614 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.969627 4781 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793b9e15-75d8-49b1-8261-cc624d33aaea-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:28:19 crc kubenswrapper[4781]: I1208 20:28:19.969641 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxvl6\" (UniqueName: \"kubernetes.io/projected/793b9e15-75d8-49b1-8261-cc624d33aaea-kube-api-access-hxvl6\") on node \"crc\" DevicePath \"\"" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.323239 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" event={"ID":"793b9e15-75d8-49b1-8261-cc624d33aaea","Type":"ContainerDied","Data":"e817c7b9f0d1abce641521c07f1c8717f4bcd7096d3dc092759550bf5b59d4ac"} Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.323296 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e817c7b9f0d1abce641521c07f1c8717f4bcd7096d3dc092759550bf5b59d4ac" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.323681 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.404773 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd"] Dec 08 20:28:20 crc kubenswrapper[4781]: E1208 20:28:20.405265 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793b9e15-75d8-49b1-8261-cc624d33aaea" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.405292 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="793b9e15-75d8-49b1-8261-cc624d33aaea" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.405719 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="793b9e15-75d8-49b1-8261-cc624d33aaea" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.406592 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.409059 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.409375 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.409581 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.413624 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.423672 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd"] Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.580991 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.581246 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc2rl\" (UniqueName: \"kubernetes.io/projected/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-kube-api-access-rc2rl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.581565 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.683563 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.683682 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc2rl\" (UniqueName: \"kubernetes.io/projected/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-kube-api-access-rc2rl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.683786 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.688178 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.691597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.700997 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc2rl\" (UniqueName: \"kubernetes.io/projected/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-kube-api-access-rc2rl\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-jr5wd\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:20 crc kubenswrapper[4781]: I1208 20:28:20.730069 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:21 crc kubenswrapper[4781]: I1208 20:28:21.031862 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd"] Dec 08 20:28:21 crc kubenswrapper[4781]: I1208 20:28:21.336243 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" event={"ID":"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39","Type":"ContainerStarted","Data":"568fb4ae1b522c6585866928381f7067c0317e424ff39a6e7c9e38475d1add0b"} Dec 08 20:28:22 crc kubenswrapper[4781]: I1208 20:28:22.352243 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" event={"ID":"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39","Type":"ContainerStarted","Data":"d5a23b34894fd0bcf6274beaf6a520b2b00ea24554d94344954969ccbbb0ce4c"} Dec 08 20:28:22 crc kubenswrapper[4781]: I1208 20:28:22.383720 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" podStartSLOduration=1.921028494 podStartE2EDuration="2.383700943s" podCreationTimestamp="2025-12-08 20:28:20 +0000 UTC" firstStartedPulling="2025-12-08 20:28:21.038658408 +0000 UTC m=+1417.189941785" lastFinishedPulling="2025-12-08 20:28:21.501330857 +0000 UTC m=+1417.652614234" observedRunningTime="2025-12-08 20:28:22.375184029 +0000 UTC m=+1418.526467406" watchObservedRunningTime="2025-12-08 20:28:22.383700943 +0000 UTC m=+1418.534984320" Dec 08 20:28:24 crc kubenswrapper[4781]: I1208 20:28:24.377272 4781 generic.go:334] "Generic (PLEG): container finished" podID="3cfbf369-bbb7-4b9e-980d-32fe2cf76c39" containerID="d5a23b34894fd0bcf6274beaf6a520b2b00ea24554d94344954969ccbbb0ce4c" exitCode=0 Dec 08 20:28:24 crc kubenswrapper[4781]: I1208 20:28:24.377846 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" event={"ID":"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39","Type":"ContainerDied","Data":"d5a23b34894fd0bcf6274beaf6a520b2b00ea24554d94344954969ccbbb0ce4c"} Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.756071 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.884678 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-ssh-key\") pod \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.884757 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc2rl\" (UniqueName: \"kubernetes.io/projected/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-kube-api-access-rc2rl\") pod \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.885052 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-inventory\") pod \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\" (UID: \"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39\") " Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.890898 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-kube-api-access-rc2rl" (OuterVolumeSpecName: "kube-api-access-rc2rl") pod "3cfbf369-bbb7-4b9e-980d-32fe2cf76c39" (UID: "3cfbf369-bbb7-4b9e-980d-32fe2cf76c39"). InnerVolumeSpecName "kube-api-access-rc2rl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.920181 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-inventory" (OuterVolumeSpecName: "inventory") pod "3cfbf369-bbb7-4b9e-980d-32fe2cf76c39" (UID: "3cfbf369-bbb7-4b9e-980d-32fe2cf76c39"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.921066 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3cfbf369-bbb7-4b9e-980d-32fe2cf76c39" (UID: "3cfbf369-bbb7-4b9e-980d-32fe2cf76c39"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.987840 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.988364 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:28:25 crc kubenswrapper[4781]: I1208 20:28:25.988424 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc2rl\" (UniqueName: \"kubernetes.io/projected/3cfbf369-bbb7-4b9e-980d-32fe2cf76c39-kube-api-access-rc2rl\") on node \"crc\" DevicePath \"\"" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.395002 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" 
event={"ID":"3cfbf369-bbb7-4b9e-980d-32fe2cf76c39","Type":"ContainerDied","Data":"568fb4ae1b522c6585866928381f7067c0317e424ff39a6e7c9e38475d1add0b"} Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.395043 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="568fb4ae1b522c6585866928381f7067c0317e424ff39a6e7c9e38475d1add0b" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.395092 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-jr5wd" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.475884 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq"] Dec 08 20:28:26 crc kubenswrapper[4781]: E1208 20:28:26.477826 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cfbf369-bbb7-4b9e-980d-32fe2cf76c39" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.477854 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cfbf369-bbb7-4b9e-980d-32fe2cf76c39" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.479353 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cfbf369-bbb7-4b9e-980d-32fe2cf76c39" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.480107 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.486809 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq"] Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.487624 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.487860 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.487906 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.488029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.599954 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.600169 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.600386 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.600693 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nhdq\" (UniqueName: \"kubernetes.io/projected/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-kube-api-access-5nhdq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.702056 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.702145 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.702224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhdq\" (UniqueName: 
\"kubernetes.io/projected/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-kube-api-access-5nhdq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.702255 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.707712 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.708095 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.708358 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc 
kubenswrapper[4781]: I1208 20:28:26.728418 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nhdq\" (UniqueName: \"kubernetes.io/projected/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-kube-api-access-5nhdq\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:26 crc kubenswrapper[4781]: I1208 20:28:26.840966 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:28:27 crc kubenswrapper[4781]: I1208 20:28:27.332168 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq"] Dec 08 20:28:27 crc kubenswrapper[4781]: I1208 20:28:27.403894 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" event={"ID":"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c","Type":"ContainerStarted","Data":"3060f77c7dfe752111dd1369fdcbc4c9b57a651c99b6d5fb1bb1a7eaed50388a"} Dec 08 20:28:28 crc kubenswrapper[4781]: I1208 20:28:28.412816 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" event={"ID":"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c","Type":"ContainerStarted","Data":"823d4112e21adafdb7f6345afa8ddebadc66881e72de208a2aac384b0778b790"} Dec 08 20:28:28 crc kubenswrapper[4781]: I1208 20:28:28.428693 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" podStartSLOduration=2.022703964 podStartE2EDuration="2.428676405s" podCreationTimestamp="2025-12-08 20:28:26 +0000 UTC" firstStartedPulling="2025-12-08 20:28:27.333197366 +0000 UTC m=+1423.484480743" lastFinishedPulling="2025-12-08 20:28:27.739169807 +0000 UTC m=+1423.890453184" 
observedRunningTime="2025-12-08 20:28:28.424829684 +0000 UTC m=+1424.576113061" watchObservedRunningTime="2025-12-08 20:28:28.428676405 +0000 UTC m=+1424.579959772" Dec 08 20:28:29 crc kubenswrapper[4781]: I1208 20:28:29.948615 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:28:29 crc kubenswrapper[4781]: I1208 20:28:29.948705 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:28:59 crc kubenswrapper[4781]: I1208 20:28:59.948475 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:28:59 crc kubenswrapper[4781]: I1208 20:28:59.949132 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:29:11 crc kubenswrapper[4781]: I1208 20:29:11.315640 4781 scope.go:117] "RemoveContainer" containerID="ba45309c5c5390af0106f9b0f6201b3213380d92620e4c59a935e3306863d0bf" Dec 08 20:29:11 crc kubenswrapper[4781]: I1208 20:29:11.366998 4781 scope.go:117] "RemoveContainer" 
containerID="54ea7d959f9d995a74b1c8d9cb6ee41eabf4c69c6381bc42921a17521ffb6825" Dec 08 20:29:29 crc kubenswrapper[4781]: I1208 20:29:29.948576 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:29:29 crc kubenswrapper[4781]: I1208 20:29:29.949238 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:29:29 crc kubenswrapper[4781]: I1208 20:29:29.949293 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:29:29 crc kubenswrapper[4781]: I1208 20:29:29.950065 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9c0ebca79495ed59d45aa666bc740d910c70a10325b1eaed4b6173c141374c9"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:29:29 crc kubenswrapper[4781]: I1208 20:29:29.950120 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://a9c0ebca79495ed59d45aa666bc740d910c70a10325b1eaed4b6173c141374c9" gracePeriod=600 Dec 08 20:29:30 crc kubenswrapper[4781]: I1208 20:29:30.087787 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="a9c0ebca79495ed59d45aa666bc740d910c70a10325b1eaed4b6173c141374c9" exitCode=0 Dec 08 20:29:30 crc kubenswrapper[4781]: I1208 20:29:30.087833 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"a9c0ebca79495ed59d45aa666bc740d910c70a10325b1eaed4b6173c141374c9"} Dec 08 20:29:30 crc kubenswrapper[4781]: I1208 20:29:30.087869 4781 scope.go:117] "RemoveContainer" containerID="4fea80c9d2853786513b0b8aceae77577c3f9f5cebb3cd832d508e012c04f4da" Dec 08 20:29:31 crc kubenswrapper[4781]: I1208 20:29:31.099523 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c"} Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.151154 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb"] Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.153065 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.155313 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.156049 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.160941 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb"] Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.346013 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6xrv\" (UniqueName: \"kubernetes.io/projected/e1df355b-c158-4578-ae10-0690aa3cf69c-kube-api-access-j6xrv\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.346472 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1df355b-c158-4578-ae10-0690aa3cf69c-secret-volume\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.346553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1df355b-c158-4578-ae10-0690aa3cf69c-config-volume\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.448730 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1df355b-c158-4578-ae10-0690aa3cf69c-secret-volume\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.449126 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1df355b-c158-4578-ae10-0690aa3cf69c-config-volume\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.449245 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6xrv\" (UniqueName: \"kubernetes.io/projected/e1df355b-c158-4578-ae10-0690aa3cf69c-kube-api-access-j6xrv\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.451889 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1df355b-c158-4578-ae10-0690aa3cf69c-config-volume\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.464619 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e1df355b-c158-4578-ae10-0690aa3cf69c-secret-volume\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.475576 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6xrv\" (UniqueName: \"kubernetes.io/projected/e1df355b-c158-4578-ae10-0690aa3cf69c-kube-api-access-j6xrv\") pod \"collect-profiles-29420430-4s5rb\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.481223 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:00 crc kubenswrapper[4781]: I1208 20:30:00.940094 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb"] Dec 08 20:30:01 crc kubenswrapper[4781]: I1208 20:30:01.396540 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" event={"ID":"e1df355b-c158-4578-ae10-0690aa3cf69c","Type":"ContainerStarted","Data":"c00812a3f1376b90211151589226efc95081902ddf66f14532a003a949794817"} Dec 08 20:30:01 crc kubenswrapper[4781]: I1208 20:30:01.396592 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" event={"ID":"e1df355b-c158-4578-ae10-0690aa3cf69c","Type":"ContainerStarted","Data":"5a1fd4ebd57c9e07c464391d4886aec8652f6fd105ca10e161e115484184e6b7"} Dec 08 20:30:01 crc kubenswrapper[4781]: I1208 20:30:01.425601 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" 
podStartSLOduration=1.425584778 podStartE2EDuration="1.425584778s" podCreationTimestamp="2025-12-08 20:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 20:30:01.419704869 +0000 UTC m=+1517.570988246" watchObservedRunningTime="2025-12-08 20:30:01.425584778 +0000 UTC m=+1517.576868155" Dec 08 20:30:02 crc kubenswrapper[4781]: I1208 20:30:02.407682 4781 generic.go:334] "Generic (PLEG): container finished" podID="e1df355b-c158-4578-ae10-0690aa3cf69c" containerID="c00812a3f1376b90211151589226efc95081902ddf66f14532a003a949794817" exitCode=0 Dec 08 20:30:02 crc kubenswrapper[4781]: I1208 20:30:02.407732 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" event={"ID":"e1df355b-c158-4578-ae10-0690aa3cf69c","Type":"ContainerDied","Data":"c00812a3f1376b90211151589226efc95081902ddf66f14532a003a949794817"} Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.745626 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.875006 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6xrv\" (UniqueName: \"kubernetes.io/projected/e1df355b-c158-4578-ae10-0690aa3cf69c-kube-api-access-j6xrv\") pod \"e1df355b-c158-4578-ae10-0690aa3cf69c\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.875393 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1df355b-c158-4578-ae10-0690aa3cf69c-secret-volume\") pod \"e1df355b-c158-4578-ae10-0690aa3cf69c\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.875452 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1df355b-c158-4578-ae10-0690aa3cf69c-config-volume\") pod \"e1df355b-c158-4578-ae10-0690aa3cf69c\" (UID: \"e1df355b-c158-4578-ae10-0690aa3cf69c\") " Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.876411 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1df355b-c158-4578-ae10-0690aa3cf69c-config-volume" (OuterVolumeSpecName: "config-volume") pod "e1df355b-c158-4578-ae10-0690aa3cf69c" (UID: "e1df355b-c158-4578-ae10-0690aa3cf69c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.881477 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1df355b-c158-4578-ae10-0690aa3cf69c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e1df355b-c158-4578-ae10-0690aa3cf69c" (UID: "e1df355b-c158-4578-ae10-0690aa3cf69c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.882225 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1df355b-c158-4578-ae10-0690aa3cf69c-kube-api-access-j6xrv" (OuterVolumeSpecName: "kube-api-access-j6xrv") pod "e1df355b-c158-4578-ae10-0690aa3cf69c" (UID: "e1df355b-c158-4578-ae10-0690aa3cf69c"). InnerVolumeSpecName "kube-api-access-j6xrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.977709 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1df355b-c158-4578-ae10-0690aa3cf69c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.977746 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1df355b-c158-4578-ae10-0690aa3cf69c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 20:30:03 crc kubenswrapper[4781]: I1208 20:30:03.977757 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6xrv\" (UniqueName: \"kubernetes.io/projected/e1df355b-c158-4578-ae10-0690aa3cf69c-kube-api-access-j6xrv\") on node \"crc\" DevicePath \"\"" Dec 08 20:30:04 crc kubenswrapper[4781]: I1208 20:30:04.426234 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" event={"ID":"e1df355b-c158-4578-ae10-0690aa3cf69c","Type":"ContainerDied","Data":"5a1fd4ebd57c9e07c464391d4886aec8652f6fd105ca10e161e115484184e6b7"} Dec 08 20:30:04 crc kubenswrapper[4781]: I1208 20:30:04.426645 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a1fd4ebd57c9e07c464391d4886aec8652f6fd105ca10e161e115484184e6b7" Dec 08 20:30:04 crc kubenswrapper[4781]: I1208 20:30:04.426292 4781 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb" Dec 08 20:30:11 crc kubenswrapper[4781]: I1208 20:30:11.454013 4781 scope.go:117] "RemoveContainer" containerID="4002d2d62cc373bdd4cdd8a9a775ab3646cad2b3bf679696b00bf99a3fe6f6ec" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.473910 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p88x2"] Dec 08 20:30:32 crc kubenswrapper[4781]: E1208 20:30:32.475231 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1df355b-c158-4578-ae10-0690aa3cf69c" containerName="collect-profiles" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.475257 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1df355b-c158-4578-ae10-0690aa3cf69c" containerName="collect-profiles" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.475532 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1df355b-c158-4578-ae10-0690aa3cf69c" containerName="collect-profiles" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.477445 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.486592 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p88x2"] Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.626053 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks7v9\" (UniqueName: \"kubernetes.io/projected/26b2303f-6ec6-4651-aa9f-3c1933aed43a-kube-api-access-ks7v9\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.626520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-catalog-content\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.626658 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-utilities\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.728289 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-catalog-content\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.728564 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-utilities\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.728598 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks7v9\" (UniqueName: \"kubernetes.io/projected/26b2303f-6ec6-4651-aa9f-3c1933aed43a-kube-api-access-ks7v9\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.728887 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-catalog-content\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.729236 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-utilities\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.752066 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks7v9\" (UniqueName: \"kubernetes.io/projected/26b2303f-6ec6-4651-aa9f-3c1933aed43a-kube-api-access-ks7v9\") pod \"community-operators-p88x2\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:32 crc kubenswrapper[4781]: I1208 20:30:32.797209 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:33 crc kubenswrapper[4781]: I1208 20:30:33.337164 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p88x2"] Dec 08 20:30:33 crc kubenswrapper[4781]: I1208 20:30:33.365928 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p88x2" event={"ID":"26b2303f-6ec6-4651-aa9f-3c1933aed43a","Type":"ContainerStarted","Data":"3419860ea1ba0574a1fe0605a82f1413a9dc455cb90147bf8e18a5fda4076e08"} Dec 08 20:30:34 crc kubenswrapper[4781]: I1208 20:30:34.379514 4781 generic.go:334] "Generic (PLEG): container finished" podID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerID="c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d" exitCode=0 Dec 08 20:30:34 crc kubenswrapper[4781]: I1208 20:30:34.379589 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p88x2" event={"ID":"26b2303f-6ec6-4651-aa9f-3c1933aed43a","Type":"ContainerDied","Data":"c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d"} Dec 08 20:30:35 crc kubenswrapper[4781]: I1208 20:30:35.397870 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p88x2" event={"ID":"26b2303f-6ec6-4651-aa9f-3c1933aed43a","Type":"ContainerStarted","Data":"86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8"} Dec 08 20:30:36 crc kubenswrapper[4781]: I1208 20:30:36.407849 4781 generic.go:334] "Generic (PLEG): container finished" podID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerID="86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8" exitCode=0 Dec 08 20:30:36 crc kubenswrapper[4781]: I1208 20:30:36.407948 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p88x2" 
event={"ID":"26b2303f-6ec6-4651-aa9f-3c1933aed43a","Type":"ContainerDied","Data":"86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8"} Dec 08 20:30:37 crc kubenswrapper[4781]: I1208 20:30:37.422467 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p88x2" event={"ID":"26b2303f-6ec6-4651-aa9f-3c1933aed43a","Type":"ContainerStarted","Data":"1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f"} Dec 08 20:30:37 crc kubenswrapper[4781]: I1208 20:30:37.469429 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p88x2" podStartSLOduration=3.019441859 podStartE2EDuration="5.469411605s" podCreationTimestamp="2025-12-08 20:30:32 +0000 UTC" firstStartedPulling="2025-12-08 20:30:34.382386236 +0000 UTC m=+1550.533669613" lastFinishedPulling="2025-12-08 20:30:36.832355982 +0000 UTC m=+1552.983639359" observedRunningTime="2025-12-08 20:30:37.463504725 +0000 UTC m=+1553.614788102" watchObservedRunningTime="2025-12-08 20:30:37.469411605 +0000 UTC m=+1553.620694982" Dec 08 20:30:42 crc kubenswrapper[4781]: I1208 20:30:42.798181 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:42 crc kubenswrapper[4781]: I1208 20:30:42.798613 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:42 crc kubenswrapper[4781]: I1208 20:30:42.848971 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:43 crc kubenswrapper[4781]: I1208 20:30:43.553150 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:43 crc kubenswrapper[4781]: I1208 20:30:43.615260 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-p88x2"] Dec 08 20:30:45 crc kubenswrapper[4781]: I1208 20:30:45.497736 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p88x2" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerName="registry-server" containerID="cri-o://1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f" gracePeriod=2 Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.009170 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.191103 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-utilities\") pod \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.191510 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks7v9\" (UniqueName: \"kubernetes.io/projected/26b2303f-6ec6-4651-aa9f-3c1933aed43a-kube-api-access-ks7v9\") pod \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.191685 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-catalog-content\") pod \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\" (UID: \"26b2303f-6ec6-4651-aa9f-3c1933aed43a\") " Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.192243 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-utilities" (OuterVolumeSpecName: "utilities") pod "26b2303f-6ec6-4651-aa9f-3c1933aed43a" (UID: 
"26b2303f-6ec6-4651-aa9f-3c1933aed43a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.198197 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b2303f-6ec6-4651-aa9f-3c1933aed43a-kube-api-access-ks7v9" (OuterVolumeSpecName: "kube-api-access-ks7v9") pod "26b2303f-6ec6-4651-aa9f-3c1933aed43a" (UID: "26b2303f-6ec6-4651-aa9f-3c1933aed43a"). InnerVolumeSpecName "kube-api-access-ks7v9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.253891 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26b2303f-6ec6-4651-aa9f-3c1933aed43a" (UID: "26b2303f-6ec6-4651-aa9f-3c1933aed43a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.293790 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.293840 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b2303f-6ec6-4651-aa9f-3c1933aed43a-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.293857 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks7v9\" (UniqueName: \"kubernetes.io/projected/26b2303f-6ec6-4651-aa9f-3c1933aed43a-kube-api-access-ks7v9\") on node \"crc\" DevicePath \"\"" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.511426 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerID="1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f" exitCode=0 Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.511505 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p88x2" event={"ID":"26b2303f-6ec6-4651-aa9f-3c1933aed43a","Type":"ContainerDied","Data":"1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f"} Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.511814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p88x2" event={"ID":"26b2303f-6ec6-4651-aa9f-3c1933aed43a","Type":"ContainerDied","Data":"3419860ea1ba0574a1fe0605a82f1413a9dc455cb90147bf8e18a5fda4076e08"} Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.511837 4781 scope.go:117] "RemoveContainer" containerID="1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.511547 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p88x2" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.542845 4781 scope.go:117] "RemoveContainer" containerID="86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.549276 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p88x2"] Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.568280 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p88x2"] Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.574463 4781 scope.go:117] "RemoveContainer" containerID="c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.607960 4781 scope.go:117] "RemoveContainer" containerID="1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f" Dec 08 20:30:46 crc kubenswrapper[4781]: E1208 20:30:46.608600 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f\": container with ID starting with 1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f not found: ID does not exist" containerID="1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.608649 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f"} err="failed to get container status \"1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f\": rpc error: code = NotFound desc = could not find container \"1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f\": container with ID starting with 1250e24bd6adbbe39bc1ae642b4d4a9b8e17952b74219d7e87061ff44610d57f not 
found: ID does not exist" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.608673 4781 scope.go:117] "RemoveContainer" containerID="86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8" Dec 08 20:30:46 crc kubenswrapper[4781]: E1208 20:30:46.609066 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8\": container with ID starting with 86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8 not found: ID does not exist" containerID="86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.609105 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8"} err="failed to get container status \"86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8\": rpc error: code = NotFound desc = could not find container \"86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8\": container with ID starting with 86552700ad37fe070e01b44639c3587e675a04391ca13190a06ffae4d98679e8 not found: ID does not exist" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.609129 4781 scope.go:117] "RemoveContainer" containerID="c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d" Dec 08 20:30:46 crc kubenswrapper[4781]: E1208 20:30:46.609576 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d\": container with ID starting with c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d not found: ID does not exist" containerID="c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d" Dec 08 20:30:46 crc kubenswrapper[4781]: I1208 20:30:46.609629 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d"} err="failed to get container status \"c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d\": rpc error: code = NotFound desc = could not find container \"c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d\": container with ID starting with c0b73e0b2ed41a5028524b3a7468b8274ced9e48ab382a8637b0abdff9d1765d not found: ID does not exist" Dec 08 20:30:48 crc kubenswrapper[4781]: I1208 20:30:48.137106 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" path="/var/lib/kubelet/pods/26b2303f-6ec6-4651-aa9f-3c1933aed43a/volumes" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.593930 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hxch4"] Dec 08 20:31:17 crc kubenswrapper[4781]: E1208 20:31:17.594802 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerName="registry-server" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.594815 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerName="registry-server" Dec 08 20:31:17 crc kubenswrapper[4781]: E1208 20:31:17.594844 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerName="extract-utilities" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.594850 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerName="extract-utilities" Dec 08 20:31:17 crc kubenswrapper[4781]: E1208 20:31:17.594871 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerName="extract-content" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 
20:31:17.594877 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerName="extract-content" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.595071 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b2303f-6ec6-4651-aa9f-3c1933aed43a" containerName="registry-server" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.596534 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.612548 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxch4"] Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.687912 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-utilities\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.688183 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-catalog-content\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.688361 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn5zs\" (UniqueName: \"kubernetes.io/projected/67e359a8-a156-4b3d-bd9a-715d67df008c-kube-api-access-cn5zs\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: 
I1208 20:31:17.790021 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cn5zs\" (UniqueName: \"kubernetes.io/projected/67e359a8-a156-4b3d-bd9a-715d67df008c-kube-api-access-cn5zs\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.790164 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-utilities\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.790196 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-catalog-content\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.790757 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-utilities\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.790832 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-catalog-content\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.822974 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn5zs\" (UniqueName: \"kubernetes.io/projected/67e359a8-a156-4b3d-bd9a-715d67df008c-kube-api-access-cn5zs\") pod \"redhat-marketplace-hxch4\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:17 crc kubenswrapper[4781]: I1208 20:31:17.920971 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:18 crc kubenswrapper[4781]: I1208 20:31:18.377983 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxch4"] Dec 08 20:31:18 crc kubenswrapper[4781]: I1208 20:31:18.811338 4781 generic.go:334] "Generic (PLEG): container finished" podID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerID="eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5" exitCode=0 Dec 08 20:31:18 crc kubenswrapper[4781]: I1208 20:31:18.811405 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxch4" event={"ID":"67e359a8-a156-4b3d-bd9a-715d67df008c","Type":"ContainerDied","Data":"eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5"} Dec 08 20:31:18 crc kubenswrapper[4781]: I1208 20:31:18.811634 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxch4" event={"ID":"67e359a8-a156-4b3d-bd9a-715d67df008c","Type":"ContainerStarted","Data":"dabb49de6eb40734a71fd61600b43233abb195c15c6226edf4a75de5972bd87a"} Dec 08 20:31:18 crc kubenswrapper[4781]: I1208 20:31:18.813556 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 20:31:19 crc kubenswrapper[4781]: I1208 20:31:19.821273 4781 generic.go:334] "Generic (PLEG): container finished" podID="67e359a8-a156-4b3d-bd9a-715d67df008c" 
containerID="ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301" exitCode=0 Dec 08 20:31:19 crc kubenswrapper[4781]: I1208 20:31:19.821348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxch4" event={"ID":"67e359a8-a156-4b3d-bd9a-715d67df008c","Type":"ContainerDied","Data":"ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301"} Dec 08 20:31:20 crc kubenswrapper[4781]: I1208 20:31:20.832681 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxch4" event={"ID":"67e359a8-a156-4b3d-bd9a-715d67df008c","Type":"ContainerStarted","Data":"8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664"} Dec 08 20:31:20 crc kubenswrapper[4781]: I1208 20:31:20.854076 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hxch4" podStartSLOduration=2.421355664 podStartE2EDuration="3.854054989s" podCreationTimestamp="2025-12-08 20:31:17 +0000 UTC" firstStartedPulling="2025-12-08 20:31:18.813337821 +0000 UTC m=+1594.964621198" lastFinishedPulling="2025-12-08 20:31:20.246037126 +0000 UTC m=+1596.397320523" observedRunningTime="2025-12-08 20:31:20.848080547 +0000 UTC m=+1596.999363944" watchObservedRunningTime="2025-12-08 20:31:20.854054989 +0000 UTC m=+1597.005338366" Dec 08 20:31:27 crc kubenswrapper[4781]: I1208 20:31:27.921233 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:27 crc kubenswrapper[4781]: I1208 20:31:27.922073 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:27 crc kubenswrapper[4781]: I1208 20:31:27.972629 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:28 crc kubenswrapper[4781]: I1208 20:31:28.952903 
4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:29 crc kubenswrapper[4781]: I1208 20:31:29.001022 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxch4"] Dec 08 20:31:30 crc kubenswrapper[4781]: I1208 20:31:30.920542 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hxch4" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerName="registry-server" containerID="cri-o://8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664" gracePeriod=2 Dec 08 20:31:31 crc kubenswrapper[4781]: I1208 20:31:31.871419 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:31 crc kubenswrapper[4781]: I1208 20:31:31.944727 4781 generic.go:334] "Generic (PLEG): container finished" podID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerID="8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664" exitCode=0 Dec 08 20:31:31 crc kubenswrapper[4781]: I1208 20:31:31.944772 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxch4" event={"ID":"67e359a8-a156-4b3d-bd9a-715d67df008c","Type":"ContainerDied","Data":"8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664"} Dec 08 20:31:31 crc kubenswrapper[4781]: I1208 20:31:31.944797 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hxch4" event={"ID":"67e359a8-a156-4b3d-bd9a-715d67df008c","Type":"ContainerDied","Data":"dabb49de6eb40734a71fd61600b43233abb195c15c6226edf4a75de5972bd87a"} Dec 08 20:31:31 crc kubenswrapper[4781]: I1208 20:31:31.944813 4781 scope.go:117] "RemoveContainer" containerID="8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664" Dec 08 20:31:31 crc kubenswrapper[4781]: I1208 
20:31:31.944986 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hxch4" Dec 08 20:31:31 crc kubenswrapper[4781]: I1208 20:31:31.972133 4781 scope.go:117] "RemoveContainer" containerID="ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301" Dec 08 20:31:31 crc kubenswrapper[4781]: I1208 20:31:31.992818 4781 scope.go:117] "RemoveContainer" containerID="eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.033584 4781 scope.go:117] "RemoveContainer" containerID="8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664" Dec 08 20:31:32 crc kubenswrapper[4781]: E1208 20:31:32.034109 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664\": container with ID starting with 8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664 not found: ID does not exist" containerID="8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.034148 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664"} err="failed to get container status \"8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664\": rpc error: code = NotFound desc = could not find container \"8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664\": container with ID starting with 8590e7a11c806db5957dc62b08ce97e4d1be66b7ad43f14f4b69fb4e36c18664 not found: ID does not exist" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.034174 4781 scope.go:117] "RemoveContainer" containerID="ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301" Dec 08 20:31:32 crc kubenswrapper[4781]: E1208 20:31:32.034539 4781 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301\": container with ID starting with ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301 not found: ID does not exist" containerID="ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.034569 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301"} err="failed to get container status \"ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301\": rpc error: code = NotFound desc = could not find container \"ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301\": container with ID starting with ae2dd7ae94648af0b40524524aa473aa2d792a024297fcac7da319ce3d4f2301 not found: ID does not exist" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.034587 4781 scope.go:117] "RemoveContainer" containerID="eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5" Dec 08 20:31:32 crc kubenswrapper[4781]: E1208 20:31:32.035214 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5\": container with ID starting with eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5 not found: ID does not exist" containerID="eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.035236 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5"} err="failed to get container status \"eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5\": rpc error: code = NotFound 
desc = could not find container \"eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5\": container with ID starting with eaab2e481532b976a63923fd5d94c3507229dff0a3efad1b3133c619ec6a85e5 not found: ID does not exist" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.068371 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-utilities\") pod \"67e359a8-a156-4b3d-bd9a-715d67df008c\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.068601 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-catalog-content\") pod \"67e359a8-a156-4b3d-bd9a-715d67df008c\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.068646 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn5zs\" (UniqueName: \"kubernetes.io/projected/67e359a8-a156-4b3d-bd9a-715d67df008c-kube-api-access-cn5zs\") pod \"67e359a8-a156-4b3d-bd9a-715d67df008c\" (UID: \"67e359a8-a156-4b3d-bd9a-715d67df008c\") " Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.069860 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-utilities" (OuterVolumeSpecName: "utilities") pod "67e359a8-a156-4b3d-bd9a-715d67df008c" (UID: "67e359a8-a156-4b3d-bd9a-715d67df008c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.075188 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e359a8-a156-4b3d-bd9a-715d67df008c-kube-api-access-cn5zs" (OuterVolumeSpecName: "kube-api-access-cn5zs") pod "67e359a8-a156-4b3d-bd9a-715d67df008c" (UID: "67e359a8-a156-4b3d-bd9a-715d67df008c"). InnerVolumeSpecName "kube-api-access-cn5zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.089531 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67e359a8-a156-4b3d-bd9a-715d67df008c" (UID: "67e359a8-a156-4b3d-bd9a-715d67df008c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.170882 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.170949 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cn5zs\" (UniqueName: \"kubernetes.io/projected/67e359a8-a156-4b3d-bd9a-715d67df008c-kube-api-access-cn5zs\") on node \"crc\" DevicePath \"\"" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.170966 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67e359a8-a156-4b3d-bd9a-715d67df008c-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 20:31:32.270720 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxch4"] Dec 08 20:31:32 crc kubenswrapper[4781]: I1208 
20:31:32.280337 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hxch4"] Dec 08 20:31:34 crc kubenswrapper[4781]: I1208 20:31:34.139766 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" path="/var/lib/kubelet/pods/67e359a8-a156-4b3d-bd9a-715d67df008c/volumes" Dec 08 20:31:34 crc kubenswrapper[4781]: I1208 20:31:34.976682 4781 generic.go:334] "Generic (PLEG): container finished" podID="3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" containerID="823d4112e21adafdb7f6345afa8ddebadc66881e72de208a2aac384b0778b790" exitCode=0 Dec 08 20:31:34 crc kubenswrapper[4781]: I1208 20:31:34.976739 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" event={"ID":"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c","Type":"ContainerDied","Data":"823d4112e21adafdb7f6345afa8ddebadc66881e72de208a2aac384b0778b790"} Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.388907 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.553931 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-ssh-key\") pod \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.554081 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nhdq\" (UniqueName: \"kubernetes.io/projected/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-kube-api-access-5nhdq\") pod \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.554125 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory\") pod \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.554154 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-bootstrap-combined-ca-bundle\") pod \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.559280 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" (UID: "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.559872 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-kube-api-access-5nhdq" (OuterVolumeSpecName: "kube-api-access-5nhdq") pod "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" (UID: "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c"). InnerVolumeSpecName "kube-api-access-5nhdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:31:36 crc kubenswrapper[4781]: E1208 20:31:36.584484 4781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory podName:3eca1a1d-a60c-4911-9cf8-fd8a82f9541c nodeName:}" failed. No retries permitted until 2025-12-08 20:31:37.084452675 +0000 UTC m=+1613.235736052 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory") pod "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" (UID: "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c") : error deleting /var/lib/kubelet/pods/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c/volume-subpaths: remove /var/lib/kubelet/pods/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c/volume-subpaths: no such file or directory Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.587265 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" (UID: "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.656335 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.656383 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nhdq\" (UniqueName: \"kubernetes.io/projected/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-kube-api-access-5nhdq\") on node \"crc\" DevicePath \"\"" Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.656398 4781 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.993233 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" event={"ID":"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c","Type":"ContainerDied","Data":"3060f77c7dfe752111dd1369fdcbc4c9b57a651c99b6d5fb1bb1a7eaed50388a"} Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.993305 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3060f77c7dfe752111dd1369fdcbc4c9b57a651c99b6d5fb1bb1a7eaed50388a" Dec 08 20:31:36 crc kubenswrapper[4781]: I1208 20:31:36.993343 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.084707 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk"] Dec 08 20:31:37 crc kubenswrapper[4781]: E1208 20:31:37.085104 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerName="registry-server" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.085125 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerName="registry-server" Dec 08 20:31:37 crc kubenswrapper[4781]: E1208 20:31:37.085148 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerName="extract-utilities" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.085155 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerName="extract-utilities" Dec 08 20:31:37 crc kubenswrapper[4781]: E1208 20:31:37.085185 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerName="extract-content" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.085191 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerName="extract-content" Dec 08 20:31:37 crc kubenswrapper[4781]: E1208 20:31:37.085202 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.085208 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.085402 
4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e359a8-a156-4b3d-bd9a-715d67df008c" containerName="registry-server" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.085422 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.086137 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.102336 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk"] Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.163461 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory\") pod \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\" (UID: \"3eca1a1d-a60c-4911-9cf8-fd8a82f9541c\") " Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.167886 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory" (OuterVolumeSpecName: "inventory") pod "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c" (UID: "3eca1a1d-a60c-4911-9cf8-fd8a82f9541c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.265431 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.265522 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x569b\" (UniqueName: \"kubernetes.io/projected/aa022451-2529-456c-99bf-9c36b807312e-kube-api-access-x569b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.265553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.265734 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3eca1a1d-a60c-4911-9cf8-fd8a82f9541c-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.367430 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: 
\"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.367488 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x569b\" (UniqueName: \"kubernetes.io/projected/aa022451-2529-456c-99bf-9c36b807312e-kube-api-access-x569b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.367528 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.372113 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.372509 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.384284 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x569b\" (UniqueName: \"kubernetes.io/projected/aa022451-2529-456c-99bf-9c36b807312e-kube-api-access-x569b\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.413554 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:31:37 crc kubenswrapper[4781]: I1208 20:31:37.951701 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk"] Dec 08 20:31:37 crc kubenswrapper[4781]: W1208 20:31:37.956127 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa022451_2529_456c_99bf_9c36b807312e.slice/crio-952adf19bb4fba0a7a7992736081e732bf3deb76e61694c3bb82639e5089e59e WatchSource:0}: Error finding container 952adf19bb4fba0a7a7992736081e732bf3deb76e61694c3bb82639e5089e59e: Status 404 returned error can't find the container with id 952adf19bb4fba0a7a7992736081e732bf3deb76e61694c3bb82639e5089e59e Dec 08 20:31:38 crc kubenswrapper[4781]: I1208 20:31:38.005833 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" event={"ID":"aa022451-2529-456c-99bf-9c36b807312e","Type":"ContainerStarted","Data":"952adf19bb4fba0a7a7992736081e732bf3deb76e61694c3bb82639e5089e59e"} Dec 08 20:31:39 crc kubenswrapper[4781]: I1208 20:31:39.015411 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" event={"ID":"aa022451-2529-456c-99bf-9c36b807312e","Type":"ContainerStarted","Data":"7b58283e8d1a9de58706377ec4d9b676f1e1869d09e772e3a9731a757ad0ce67"} Dec 08 20:31:39 crc kubenswrapper[4781]: I1208 
20:31:39.031290 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" podStartSLOduration=1.23326001 podStartE2EDuration="2.031268809s" podCreationTimestamp="2025-12-08 20:31:37 +0000 UTC" firstStartedPulling="2025-12-08 20:31:37.958379873 +0000 UTC m=+1614.109663240" lastFinishedPulling="2025-12-08 20:31:38.756388642 +0000 UTC m=+1614.907672039" observedRunningTime="2025-12-08 20:31:39.029065395 +0000 UTC m=+1615.180348772" watchObservedRunningTime="2025-12-08 20:31:39.031268809 +0000 UTC m=+1615.182552186" Dec 08 20:31:59 crc kubenswrapper[4781]: I1208 20:31:59.947799 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:31:59 crc kubenswrapper[4781]: I1208 20:31:59.948341 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:32:29 crc kubenswrapper[4781]: I1208 20:32:29.947786 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:32:29 crc kubenswrapper[4781]: I1208 20:32:29.948505 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.045735 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-vhmq2"] Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.060806 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-trdlp"] Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.069421 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-vhmq2"] Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.077245 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-trdlp"] Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.085480 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5ece-account-create-update-q2l9w"] Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.094240 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5ece-account-create-update-q2l9w"] Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.136819 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f128781-d8dc-4990-a2b4-bbe58950f8c4" path="/var/lib/kubelet/pods/2f128781-d8dc-4990-a2b4-bbe58950f8c4/volumes" Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.137740 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c3df1d-d4ef-4f84-96a8-a4b6b2345844" path="/var/lib/kubelet/pods/b2c3df1d-d4ef-4f84-96a8-a4b6b2345844/volumes" Dec 08 20:32:34 crc kubenswrapper[4781]: I1208 20:32:34.138534 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3fd2e65-3874-4123-98d0-38e820feb05d" path="/var/lib/kubelet/pods/d3fd2e65-3874-4123-98d0-38e820feb05d/volumes" Dec 08 20:32:35 crc kubenswrapper[4781]: I1208 20:32:35.040871 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-db-create-zxgkj"] Dec 08 20:32:35 crc kubenswrapper[4781]: I1208 20:32:35.052415 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b342-account-create-update-b5hpz"] Dec 08 20:32:35 crc kubenswrapper[4781]: I1208 20:32:35.061602 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-zxgkj"] Dec 08 20:32:35 crc kubenswrapper[4781]: I1208 20:32:35.069850 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b342-account-create-update-b5hpz"] Dec 08 20:32:35 crc kubenswrapper[4781]: I1208 20:32:35.079797 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e0fb-account-create-update-fwgdh"] Dec 08 20:32:35 crc kubenswrapper[4781]: I1208 20:32:35.092702 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e0fb-account-create-update-fwgdh"] Dec 08 20:32:36 crc kubenswrapper[4781]: I1208 20:32:36.142366 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9b08ae-f5f5-430b-9c0c-f54aafa4261e" path="/var/lib/kubelet/pods/8d9b08ae-f5f5-430b-9c0c-f54aafa4261e/volumes" Dec 08 20:32:36 crc kubenswrapper[4781]: I1208 20:32:36.143195 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf42e858-785c-484b-ab72-cddeddbdd145" path="/var/lib/kubelet/pods/bf42e858-785c-484b-ab72-cddeddbdd145/volumes" Dec 08 20:32:36 crc kubenswrapper[4781]: I1208 20:32:36.144011 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c90921e9-885e-436c-833a-8f02c075c898" path="/var/lib/kubelet/pods/c90921e9-885e-436c-833a-8f02c075c898/volumes" Dec 08 20:32:59 crc kubenswrapper[4781]: I1208 20:32:59.948428 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 08 20:32:59 crc kubenswrapper[4781]: I1208 20:32:59.949105 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:32:59 crc kubenswrapper[4781]: I1208 20:32:59.949164 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:32:59 crc kubenswrapper[4781]: I1208 20:32:59.950168 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:32:59 crc kubenswrapper[4781]: I1208 20:32:59.950257 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" gracePeriod=600 Dec 08 20:33:00 crc kubenswrapper[4781]: E1208 20:33:00.084465 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:33:00 crc kubenswrapper[4781]: I1208 20:33:00.783268 
4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" exitCode=0 Dec 08 20:33:00 crc kubenswrapper[4781]: I1208 20:33:00.783318 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c"} Dec 08 20:33:00 crc kubenswrapper[4781]: I1208 20:33:00.783354 4781 scope.go:117] "RemoveContainer" containerID="a9c0ebca79495ed59d45aa666bc740d910c70a10325b1eaed4b6173c141374c9" Dec 08 20:33:00 crc kubenswrapper[4781]: I1208 20:33:00.784052 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:33:00 crc kubenswrapper[4781]: E1208 20:33:00.784291 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:33:11 crc kubenswrapper[4781]: I1208 20:33:11.634291 4781 scope.go:117] "RemoveContainer" containerID="673b071febd5166e89e6fad72a1172c8e6eec2af6fa3dadd1fa5727d0a3eed5d" Dec 08 20:33:11 crc kubenswrapper[4781]: I1208 20:33:11.665341 4781 scope.go:117] "RemoveContainer" containerID="970549ce7c1ea4b18c2df6edbae4e26e00ca2f5c23efb145ac0e004ec668fe7e" Dec 08 20:33:11 crc kubenswrapper[4781]: I1208 20:33:11.723901 4781 scope.go:117] "RemoveContainer" containerID="3a3decd880070a4dddfa0a7e460979a7dd8508a5cb06adfcafcfae3a1a54751f" Dec 08 20:33:11 crc kubenswrapper[4781]: I1208 20:33:11.768504 4781 scope.go:117] 
"RemoveContainer" containerID="14e2b4d5b8812b13b2b7ca3b30c52c8fe7b05a83ca68dde969ee627990d16d0f" Dec 08 20:33:11 crc kubenswrapper[4781]: I1208 20:33:11.802793 4781 scope.go:117] "RemoveContainer" containerID="3f4167cbfbffa5757a1ca63f27e83180dc6295488d9338052690c3acf214cc9a" Dec 08 20:33:11 crc kubenswrapper[4781]: I1208 20:33:11.876198 4781 scope.go:117] "RemoveContainer" containerID="0f132fa717169ed1cefed093dab0a382dee9183233cacfab5249ea8c16c8f53f" Dec 08 20:33:13 crc kubenswrapper[4781]: I1208 20:33:13.126443 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:33:13 crc kubenswrapper[4781]: E1208 20:33:13.127026 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:33:18 crc kubenswrapper[4781]: I1208 20:33:18.056334 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-b9gc6"] Dec 08 20:33:18 crc kubenswrapper[4781]: I1208 20:33:18.069291 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-b9gc6"] Dec 08 20:33:18 crc kubenswrapper[4781]: I1208 20:33:18.143683 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6f1315-4ed1-4ae4-988a-81375adf148b" path="/var/lib/kubelet/pods/fa6f1315-4ed1-4ae4-988a-81375adf148b/volumes" Dec 08 20:33:21 crc kubenswrapper[4781]: I1208 20:33:21.024852 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-wbjm9"] Dec 08 20:33:21 crc kubenswrapper[4781]: I1208 20:33:21.037599 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-6b1d-account-create-update-smzwf"] Dec 08 20:33:21 crc kubenswrapper[4781]: I1208 20:33:21.048151 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-wbjm9"] Dec 08 20:33:21 crc kubenswrapper[4781]: I1208 20:33:21.056300 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-6b1d-account-create-update-smzwf"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.031188 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c18b-account-create-update-vmjzw"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.040636 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tbszx"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.049505 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0c06-account-create-update-t4vwj"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.058164 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-k6p2r"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.066351 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0c06-account-create-update-t4vwj"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.075296 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tbszx"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.086133 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c18b-account-create-update-vmjzw"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.094688 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-k6p2r"] Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.137151 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="104209dd-ab9e-4353-8a27-91f71e6ce510" path="/var/lib/kubelet/pods/104209dd-ab9e-4353-8a27-91f71e6ce510/volumes" Dec 08 
20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.138373 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63df1682-24c8-47c7-9140-8ec51934bd3c" path="/var/lib/kubelet/pods/63df1682-24c8-47c7-9140-8ec51934bd3c/volumes" Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.139973 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89107c30-6f6d-45ff-a4f2-3a956e78c16c" path="/var/lib/kubelet/pods/89107c30-6f6d-45ff-a4f2-3a956e78c16c/volumes" Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.140808 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b52eea-5a05-48e1-ad62-712af19a2e8c" path="/var/lib/kubelet/pods/a8b52eea-5a05-48e1-ad62-712af19a2e8c/volumes" Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.142300 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf" path="/var/lib/kubelet/pods/ee95e7c8-c862-4b21-85d7-0f0e19ddd5bf/volumes" Dec 08 20:33:22 crc kubenswrapper[4781]: I1208 20:33:22.143062 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdc65348-cbcf-4d61-86b1-3ce8584964f3" path="/var/lib/kubelet/pods/fdc65348-cbcf-4d61-86b1-3ce8584964f3/volumes" Dec 08 20:33:26 crc kubenswrapper[4781]: I1208 20:33:26.009746 4781 generic.go:334] "Generic (PLEG): container finished" podID="aa022451-2529-456c-99bf-9c36b807312e" containerID="7b58283e8d1a9de58706377ec4d9b676f1e1869d09e772e3a9731a757ad0ce67" exitCode=0 Dec 08 20:33:26 crc kubenswrapper[4781]: I1208 20:33:26.009858 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" event={"ID":"aa022451-2529-456c-99bf-9c36b807312e","Type":"ContainerDied","Data":"7b58283e8d1a9de58706377ec4d9b676f1e1869d09e772e3a9731a757ad0ce67"} Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.127080 4781 scope.go:117] "RemoveContainer" 
containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:33:27 crc kubenswrapper[4781]: E1208 20:33:27.127623 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.532703 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.649239 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-ssh-key\") pod \"aa022451-2529-456c-99bf-9c36b807312e\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.649391 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x569b\" (UniqueName: \"kubernetes.io/projected/aa022451-2529-456c-99bf-9c36b807312e-kube-api-access-x569b\") pod \"aa022451-2529-456c-99bf-9c36b807312e\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.649526 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-inventory\") pod \"aa022451-2529-456c-99bf-9c36b807312e\" (UID: \"aa022451-2529-456c-99bf-9c36b807312e\") " Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.656352 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/aa022451-2529-456c-99bf-9c36b807312e-kube-api-access-x569b" (OuterVolumeSpecName: "kube-api-access-x569b") pod "aa022451-2529-456c-99bf-9c36b807312e" (UID: "aa022451-2529-456c-99bf-9c36b807312e"). InnerVolumeSpecName "kube-api-access-x569b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.681639 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-inventory" (OuterVolumeSpecName: "inventory") pod "aa022451-2529-456c-99bf-9c36b807312e" (UID: "aa022451-2529-456c-99bf-9c36b807312e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.683539 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa022451-2529-456c-99bf-9c36b807312e" (UID: "aa022451-2529-456c-99bf-9c36b807312e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.751813 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.751843 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa022451-2529-456c-99bf-9c36b807312e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:33:27 crc kubenswrapper[4781]: I1208 20:33:27.751853 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x569b\" (UniqueName: \"kubernetes.io/projected/aa022451-2529-456c-99bf-9c36b807312e-kube-api-access-x569b\") on node \"crc\" DevicePath \"\"" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.028873 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" event={"ID":"aa022451-2529-456c-99bf-9c36b807312e","Type":"ContainerDied","Data":"952adf19bb4fba0a7a7992736081e732bf3deb76e61694c3bb82639e5089e59e"} Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.028910 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952adf19bb4fba0a7a7992736081e732bf3deb76e61694c3bb82639e5089e59e" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.028965 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.142503 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d"] Dec 08 20:33:28 crc kubenswrapper[4781]: E1208 20:33:28.142831 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa022451-2529-456c-99bf-9c36b807312e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.142844 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa022451-2529-456c-99bf-9c36b807312e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.143040 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa022451-2529-456c-99bf-9c36b807312e" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.143664 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.148619 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.148851 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.149161 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.149320 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.153642 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d"] Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.291658 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.291889 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.292060 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxtx7\" (UniqueName: \"kubernetes.io/projected/b174f924-296a-45a3-b80b-fdec0f219fa8-kube-api-access-sxtx7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.393335 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.393425 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxtx7\" (UniqueName: \"kubernetes.io/projected/b174f924-296a-45a3-b80b-fdec0f219fa8-kube-api-access-sxtx7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.393480 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.397359 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.398356 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.409520 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxtx7\" (UniqueName: \"kubernetes.io/projected/b174f924-296a-45a3-b80b-fdec0f219fa8-kube-api-access-sxtx7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.460302 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:33:28 crc kubenswrapper[4781]: I1208 20:33:28.973277 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d"] Dec 08 20:33:29 crc kubenswrapper[4781]: I1208 20:33:29.034906 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9r6k2"] Dec 08 20:33:29 crc kubenswrapper[4781]: I1208 20:33:29.042959 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9r6k2"] Dec 08 20:33:29 crc kubenswrapper[4781]: I1208 20:33:29.044794 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" event={"ID":"b174f924-296a-45a3-b80b-fdec0f219fa8","Type":"ContainerStarted","Data":"5864c75e0ebc44c0ee8bc4bd0b28f28599eccb7b6f72f3eba79215070029d36b"} Dec 08 20:33:30 crc kubenswrapper[4781]: I1208 20:33:30.055619 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" event={"ID":"b174f924-296a-45a3-b80b-fdec0f219fa8","Type":"ContainerStarted","Data":"b6a528f2004729e73e0b68619613046da8819f7e736e8a848aaca538d3c87de7"} Dec 08 20:33:30 crc kubenswrapper[4781]: I1208 20:33:30.072274 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" podStartSLOduration=1.505177705 podStartE2EDuration="2.072255397s" podCreationTimestamp="2025-12-08 20:33:28 +0000 UTC" firstStartedPulling="2025-12-08 20:33:28.983308993 +0000 UTC m=+1725.134592370" lastFinishedPulling="2025-12-08 20:33:29.550386685 +0000 UTC m=+1725.701670062" observedRunningTime="2025-12-08 20:33:30.070812435 +0000 UTC m=+1726.222095812" watchObservedRunningTime="2025-12-08 20:33:30.072255397 +0000 UTC m=+1726.223538774" Dec 08 20:33:30 crc 
kubenswrapper[4781]: I1208 20:33:30.137751 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2759aaa8-f901-4a75-9341-5defd0024b8e" path="/var/lib/kubelet/pods/2759aaa8-f901-4a75-9341-5defd0024b8e/volumes" Dec 08 20:33:38 crc kubenswrapper[4781]: I1208 20:33:38.125601 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:33:38 crc kubenswrapper[4781]: E1208 20:33:38.126368 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:33:51 crc kubenswrapper[4781]: I1208 20:33:51.126218 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:33:51 crc kubenswrapper[4781]: E1208 20:33:51.127250 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:33:58 crc kubenswrapper[4781]: I1208 20:33:58.046440 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-5ls9p"] Dec 08 20:33:58 crc kubenswrapper[4781]: I1208 20:33:58.054603 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-5ls9p"] Dec 08 20:33:58 crc kubenswrapper[4781]: I1208 20:33:58.137557 4781 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="ac6c7297-a2fe-4569-b0a2-a0a2329df115" path="/var/lib/kubelet/pods/ac6c7297-a2fe-4569-b0a2-a0a2329df115/volumes" Dec 08 20:34:05 crc kubenswrapper[4781]: I1208 20:34:05.047296 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-bnfg8"] Dec 08 20:34:05 crc kubenswrapper[4781]: I1208 20:34:05.059446 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-bnfg8"] Dec 08 20:34:06 crc kubenswrapper[4781]: I1208 20:34:06.126408 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:34:06 crc kubenswrapper[4781]: E1208 20:34:06.126757 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:34:06 crc kubenswrapper[4781]: I1208 20:34:06.136936 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4245549-33f8-4a0e-a17e-08417bed869c" path="/var/lib/kubelet/pods/c4245549-33f8-4a0e-a17e-08417bed869c/volumes" Dec 08 20:34:07 crc kubenswrapper[4781]: I1208 20:34:07.030730 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-p8pdd"] Dec 08 20:34:07 crc kubenswrapper[4781]: I1208 20:34:07.046005 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-p8pdd"] Dec 08 20:34:08 crc kubenswrapper[4781]: I1208 20:34:08.136499 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec5cb8d0-f612-43f8-a3c2-27953b92735b" path="/var/lib/kubelet/pods/ec5cb8d0-f612-43f8-a3c2-27953b92735b/volumes" Dec 08 20:34:12 crc kubenswrapper[4781]: 
I1208 20:34:12.017606 4781 scope.go:117] "RemoveContainer" containerID="373afcb128a58ba2a88fcd5198d686d3ef5595ac3346557bfb459cb6960fdfaf" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.049947 4781 scope.go:117] "RemoveContainer" containerID="ae80acc57ad5d9145b7e1568ba9e3bb6261bae1149b3c60277353d3b449226b2" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.096377 4781 scope.go:117] "RemoveContainer" containerID="fb0b2141e92491eafdc77c282dba8daee8151cf52a1e3db3ca7895de26ab804a" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.147637 4781 scope.go:117] "RemoveContainer" containerID="14dd33fc2715cde77428d8609010f9fc2ac3ecdf41ebd2365f9ffa29ad9d235f" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.193764 4781 scope.go:117] "RemoveContainer" containerID="644d14bbbacdfc85c71670721a4824ba17f64b2cae83c57042d18c703ea16cf5" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.220896 4781 scope.go:117] "RemoveContainer" containerID="d859300ce3f58859e8c5cf261bb796ad5a7b29f7f6825fab7f30e4a129604b9e" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.319870 4781 scope.go:117] "RemoveContainer" containerID="05b3fdd09127636c6c1ef51e29d6eae73de8c801d3eff8f636bdf079c37fad52" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.363124 4781 scope.go:117] "RemoveContainer" containerID="f54fda88e581234a7d6da66f0aca5fed2f7c097886ea21329647c89526a4ae15" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.406267 4781 scope.go:117] "RemoveContainer" containerID="14fa401579550a4ae87b42d8db67cadeb1144d67d296365a254283ce4a0dc75f" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.466579 4781 scope.go:117] "RemoveContainer" containerID="699d994318ce90c6ba5dd35df8ce7edef7238aad2c15d3a77cd4c0f41e6e6e72" Dec 08 20:34:12 crc kubenswrapper[4781]: I1208 20:34:12.505826 4781 scope.go:117] "RemoveContainer" containerID="03525b46669a0990a51dae4456f5e479895c525c92f468109a0d2c80e1007ee6" Dec 08 20:34:19 crc kubenswrapper[4781]: I1208 20:34:19.126353 4781 
scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:34:19 crc kubenswrapper[4781]: E1208 20:34:19.127218 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:34:22 crc kubenswrapper[4781]: I1208 20:34:22.037216 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9wz9p"] Dec 08 20:34:22 crc kubenswrapper[4781]: I1208 20:34:22.046821 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-75vwt"] Dec 08 20:34:22 crc kubenswrapper[4781]: I1208 20:34:22.056051 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-75vwt"] Dec 08 20:34:22 crc kubenswrapper[4781]: I1208 20:34:22.064849 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9wz9p"] Dec 08 20:34:22 crc kubenswrapper[4781]: I1208 20:34:22.135355 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279c45d3-ece4-42b6-8968-90806c171bf9" path="/var/lib/kubelet/pods/279c45d3-ece4-42b6-8968-90806c171bf9/volumes" Dec 08 20:34:22 crc kubenswrapper[4781]: I1208 20:34:22.136271 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="499d0466-ecbe-4866-b516-3c778c16ec94" path="/var/lib/kubelet/pods/499d0466-ecbe-4866-b516-3c778c16ec94/volumes" Dec 08 20:34:32 crc kubenswrapper[4781]: I1208 20:34:32.126077 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:34:32 crc kubenswrapper[4781]: E1208 20:34:32.126883 4781 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:34:47 crc kubenswrapper[4781]: I1208 20:34:47.127137 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:34:47 crc kubenswrapper[4781]: E1208 20:34:47.128085 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:34:51 crc kubenswrapper[4781]: I1208 20:34:51.865469 4781 generic.go:334] "Generic (PLEG): container finished" podID="b174f924-296a-45a3-b80b-fdec0f219fa8" containerID="b6a528f2004729e73e0b68619613046da8819f7e736e8a848aaca538d3c87de7" exitCode=0 Dec 08 20:34:51 crc kubenswrapper[4781]: I1208 20:34:51.865554 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" event={"ID":"b174f924-296a-45a3-b80b-fdec0f219fa8","Type":"ContainerDied","Data":"b6a528f2004729e73e0b68619613046da8819f7e736e8a848aaca538d3c87de7"} Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.250751 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.323534 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxtx7\" (UniqueName: \"kubernetes.io/projected/b174f924-296a-45a3-b80b-fdec0f219fa8-kube-api-access-sxtx7\") pod \"b174f924-296a-45a3-b80b-fdec0f219fa8\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.323632 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-inventory\") pod \"b174f924-296a-45a3-b80b-fdec0f219fa8\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.323833 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-ssh-key\") pod \"b174f924-296a-45a3-b80b-fdec0f219fa8\" (UID: \"b174f924-296a-45a3-b80b-fdec0f219fa8\") " Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.341930 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b174f924-296a-45a3-b80b-fdec0f219fa8-kube-api-access-sxtx7" (OuterVolumeSpecName: "kube-api-access-sxtx7") pod "b174f924-296a-45a3-b80b-fdec0f219fa8" (UID: "b174f924-296a-45a3-b80b-fdec0f219fa8"). InnerVolumeSpecName "kube-api-access-sxtx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.388136 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-inventory" (OuterVolumeSpecName: "inventory") pod "b174f924-296a-45a3-b80b-fdec0f219fa8" (UID: "b174f924-296a-45a3-b80b-fdec0f219fa8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.426152 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.426184 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxtx7\" (UniqueName: \"kubernetes.io/projected/b174f924-296a-45a3-b80b-fdec0f219fa8-kube-api-access-sxtx7\") on node \"crc\" DevicePath \"\"" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.455097 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b174f924-296a-45a3-b80b-fdec0f219fa8" (UID: "b174f924-296a-45a3-b80b-fdec0f219fa8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.527814 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b174f924-296a-45a3-b80b-fdec0f219fa8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.884282 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" event={"ID":"b174f924-296a-45a3-b80b-fdec0f219fa8","Type":"ContainerDied","Data":"5864c75e0ebc44c0ee8bc4bd0b28f28599eccb7b6f72f3eba79215070029d36b"} Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.884326 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.884480 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5864c75e0ebc44c0ee8bc4bd0b28f28599eccb7b6f72f3eba79215070029d36b" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.965127 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7"] Dec 08 20:34:53 crc kubenswrapper[4781]: E1208 20:34:53.965548 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b174f924-296a-45a3-b80b-fdec0f219fa8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.965569 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b174f924-296a-45a3-b80b-fdec0f219fa8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.965771 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b174f924-296a-45a3-b80b-fdec0f219fa8" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.967887 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.970021 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.970506 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.970594 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.976234 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:34:53 crc kubenswrapper[4781]: I1208 20:34:53.980530 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7"] Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.036444 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjlwl\" (UniqueName: \"kubernetes.io/projected/e9261e8f-a212-4633-bc8d-06c952d3dc9f-kube-api-access-pjlwl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.037091 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 
20:34:54.037373 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.139142 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.139287 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.139379 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjlwl\" (UniqueName: \"kubernetes.io/projected/e9261e8f-a212-4633-bc8d-06c952d3dc9f-kube-api-access-pjlwl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.143860 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.149716 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.160331 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjlwl\" (UniqueName: \"kubernetes.io/projected/e9261e8f-a212-4633-bc8d-06c952d3dc9f-kube-api-access-pjlwl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.286252 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.828423 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7"] Dec 08 20:34:54 crc kubenswrapper[4781]: I1208 20:34:54.911586 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" event={"ID":"e9261e8f-a212-4633-bc8d-06c952d3dc9f","Type":"ContainerStarted","Data":"85ab4763fee13e39775a2dac14dbc3fb7672ad94685b6d31ee86aa96f9de1b8e"} Dec 08 20:34:55 crc kubenswrapper[4781]: I1208 20:34:55.922222 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" event={"ID":"e9261e8f-a212-4633-bc8d-06c952d3dc9f","Type":"ContainerStarted","Data":"0a0215e9f6f26cc90a12d5e846e355493d59ad12da3c1836fb58bce14d1f1a39"} Dec 08 20:34:55 crc kubenswrapper[4781]: I1208 20:34:55.934382 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" podStartSLOduration=2.339220943 podStartE2EDuration="2.934351411s" podCreationTimestamp="2025-12-08 20:34:53 +0000 UTC" firstStartedPulling="2025-12-08 20:34:54.834304409 +0000 UTC m=+1810.985587786" lastFinishedPulling="2025-12-08 20:34:55.429434877 +0000 UTC m=+1811.580718254" observedRunningTime="2025-12-08 20:34:55.933422655 +0000 UTC m=+1812.084706042" watchObservedRunningTime="2025-12-08 20:34:55.934351411 +0000 UTC m=+1812.085634788" Dec 08 20:34:56 crc kubenswrapper[4781]: I1208 20:34:56.080912 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-dplq8"] Dec 08 20:34:56 crc kubenswrapper[4781]: I1208 20:34:56.088411 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-dplq8"] Dec 08 20:34:56 crc 
kubenswrapper[4781]: I1208 20:34:56.136056 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4532eac0-b6a8-4560-bb45-ba9b78cc4eb4" path="/var/lib/kubelet/pods/4532eac0-b6a8-4560-bb45-ba9b78cc4eb4/volumes" Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.032550 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9c42-account-create-update-4r8td"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.051456 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-057f-account-create-update-58gjv"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.059176 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-dae9-account-create-update-jrqz4"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.066348 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-xlpd4"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.073725 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-dae9-account-create-update-jrqz4"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.080685 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-dvfwv"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.088652 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-057f-account-create-update-58gjv"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.096053 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-xlpd4"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.104572 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9c42-account-create-update-4r8td"] Dec 08 20:34:57 crc kubenswrapper[4781]: I1208 20:34:57.111987 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-dvfwv"] Dec 08 20:34:58 crc kubenswrapper[4781]: I1208 
20:34:58.139341 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ada136d-baf8-411b-aa50-66edaf44a52b" path="/var/lib/kubelet/pods/0ada136d-baf8-411b-aa50-66edaf44a52b/volumes" Dec 08 20:34:58 crc kubenswrapper[4781]: I1208 20:34:58.140175 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f129eb-3261-41d7-99e4-a6de53a7fd31" path="/var/lib/kubelet/pods/12f129eb-3261-41d7-99e4-a6de53a7fd31/volumes" Dec 08 20:34:58 crc kubenswrapper[4781]: I1208 20:34:58.140816 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2627247a-1d1b-4959-8b8f-e8750950b5ec" path="/var/lib/kubelet/pods/2627247a-1d1b-4959-8b8f-e8750950b5ec/volumes" Dec 08 20:34:58 crc kubenswrapper[4781]: I1208 20:34:58.141442 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63220ecf-e9cb-45c3-b7a6-ac8a631cb22c" path="/var/lib/kubelet/pods/63220ecf-e9cb-45c3-b7a6-ac8a631cb22c/volumes" Dec 08 20:34:58 crc kubenswrapper[4781]: I1208 20:34:58.142585 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd06929-88eb-4520-a1e9-97c32d3c2223" path="/var/lib/kubelet/pods/afd06929-88eb-4520-a1e9-97c32d3c2223/volumes" Dec 08 20:35:00 crc kubenswrapper[4781]: I1208 20:35:00.126881 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:35:00 crc kubenswrapper[4781]: E1208 20:35:00.127661 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:35:01 crc kubenswrapper[4781]: I1208 20:35:01.029744 4781 generic.go:334] "Generic (PLEG): container finished" 
podID="e9261e8f-a212-4633-bc8d-06c952d3dc9f" containerID="0a0215e9f6f26cc90a12d5e846e355493d59ad12da3c1836fb58bce14d1f1a39" exitCode=0 Dec 08 20:35:01 crc kubenswrapper[4781]: I1208 20:35:01.029828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" event={"ID":"e9261e8f-a212-4633-bc8d-06c952d3dc9f","Type":"ContainerDied","Data":"0a0215e9f6f26cc90a12d5e846e355493d59ad12da3c1836fb58bce14d1f1a39"} Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.457556 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.546557 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-inventory\") pod \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.546658 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjlwl\" (UniqueName: \"kubernetes.io/projected/e9261e8f-a212-4633-bc8d-06c952d3dc9f-kube-api-access-pjlwl\") pod \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.546747 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-ssh-key\") pod \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\" (UID: \"e9261e8f-a212-4633-bc8d-06c952d3dc9f\") " Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.555398 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9261e8f-a212-4633-bc8d-06c952d3dc9f-kube-api-access-pjlwl" (OuterVolumeSpecName: 
"kube-api-access-pjlwl") pod "e9261e8f-a212-4633-bc8d-06c952d3dc9f" (UID: "e9261e8f-a212-4633-bc8d-06c952d3dc9f"). InnerVolumeSpecName "kube-api-access-pjlwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.585884 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e9261e8f-a212-4633-bc8d-06c952d3dc9f" (UID: "e9261e8f-a212-4633-bc8d-06c952d3dc9f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.597912 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-inventory" (OuterVolumeSpecName: "inventory") pod "e9261e8f-a212-4633-bc8d-06c952d3dc9f" (UID: "e9261e8f-a212-4633-bc8d-06c952d3dc9f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.648452 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.648482 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjlwl\" (UniqueName: \"kubernetes.io/projected/e9261e8f-a212-4633-bc8d-06c952d3dc9f-kube-api-access-pjlwl\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:02 crc kubenswrapper[4781]: I1208 20:35:02.648491 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e9261e8f-a212-4633-bc8d-06c952d3dc9f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.058852 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" event={"ID":"e9261e8f-a212-4633-bc8d-06c952d3dc9f","Type":"ContainerDied","Data":"85ab4763fee13e39775a2dac14dbc3fb7672ad94685b6d31ee86aa96f9de1b8e"} Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.058929 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85ab4763fee13e39775a2dac14dbc3fb7672ad94685b6d31ee86aa96f9de1b8e" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.059159 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.123731 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6"] Dec 08 20:35:03 crc kubenswrapper[4781]: E1208 20:35:03.124364 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9261e8f-a212-4633-bc8d-06c952d3dc9f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.124383 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9261e8f-a212-4633-bc8d-06c952d3dc9f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.124575 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9261e8f-a212-4633-bc8d-06c952d3dc9f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.125580 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.128193 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.128800 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.133600 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6"] Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.134154 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.134257 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.220196 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jqg\" (UniqueName: \"kubernetes.io/projected/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-kube-api-access-g7jqg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.220385 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.220497 4781 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.321646 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jqg\" (UniqueName: \"kubernetes.io/projected/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-kube-api-access-g7jqg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.321750 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.321791 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.336378 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: 
\"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.336412 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.338847 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jqg\" (UniqueName: \"kubernetes.io/projected/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-kube-api-access-g7jqg\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-p9qr6\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:03 crc kubenswrapper[4781]: I1208 20:35:03.443613 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:04 crc kubenswrapper[4781]: I1208 20:35:04.090205 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6"] Dec 08 20:35:05 crc kubenswrapper[4781]: I1208 20:35:05.118483 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" event={"ID":"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a","Type":"ContainerStarted","Data":"c66c6774ac26bad482f307e7db7bf89c497cfc5246910d1b31c309e8838f7ceb"} Dec 08 20:35:05 crc kubenswrapper[4781]: I1208 20:35:05.119042 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" event={"ID":"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a","Type":"ContainerStarted","Data":"aced59dea9130c918545b0f2ac5606d2277c6bdda31e961d424ce45d9ae8050d"} Dec 08 20:35:05 crc kubenswrapper[4781]: I1208 20:35:05.144731 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" podStartSLOduration=1.61235391 podStartE2EDuration="2.144711404s" podCreationTimestamp="2025-12-08 20:35:03 +0000 UTC" firstStartedPulling="2025-12-08 20:35:04.118463033 +0000 UTC m=+1820.269746410" lastFinishedPulling="2025-12-08 20:35:04.650820527 +0000 UTC m=+1820.802103904" observedRunningTime="2025-12-08 20:35:05.136683163 +0000 UTC m=+1821.287966560" watchObservedRunningTime="2025-12-08 20:35:05.144711404 +0000 UTC m=+1821.295994781" Dec 08 20:35:10 crc kubenswrapper[4781]: I1208 20:35:10.885782 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4h98l"] Dec 08 20:35:10 crc kubenswrapper[4781]: I1208 20:35:10.889323 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:10 crc kubenswrapper[4781]: I1208 20:35:10.897410 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4h98l"] Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.026673 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-utilities\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.026946 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lsd\" (UniqueName: \"kubernetes.io/projected/fca1339c-81f9-4ba1-ae68-da8529a5821b-kube-api-access-m2lsd\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.027025 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-catalog-content\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.147659 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lsd\" (UniqueName: \"kubernetes.io/projected/fca1339c-81f9-4ba1-ae68-da8529a5821b-kube-api-access-m2lsd\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.147732 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-catalog-content\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.147836 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-utilities\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.148316 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-utilities\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.148604 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-catalog-content\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.174169 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lsd\" (UniqueName: \"kubernetes.io/projected/fca1339c-81f9-4ba1-ae68-da8529a5821b-kube-api-access-m2lsd\") pod \"certified-operators-4h98l\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.218347 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:11 crc kubenswrapper[4781]: I1208 20:35:11.740817 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4h98l"] Dec 08 20:35:11 crc kubenswrapper[4781]: W1208 20:35:11.748579 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfca1339c_81f9_4ba1_ae68_da8529a5821b.slice/crio-777ae43d7013d80c3096b1b4d7a7068ebf82c122800e8c602b1191ed85369e7d WatchSource:0}: Error finding container 777ae43d7013d80c3096b1b4d7a7068ebf82c122800e8c602b1191ed85369e7d: Status 404 returned error can't find the container with id 777ae43d7013d80c3096b1b4d7a7068ebf82c122800e8c602b1191ed85369e7d Dec 08 20:35:12 crc kubenswrapper[4781]: I1208 20:35:12.187083 4781 generic.go:334] "Generic (PLEG): container finished" podID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerID="51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea" exitCode=0 Dec 08 20:35:12 crc kubenswrapper[4781]: I1208 20:35:12.187228 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h98l" event={"ID":"fca1339c-81f9-4ba1-ae68-da8529a5821b","Type":"ContainerDied","Data":"51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea"} Dec 08 20:35:12 crc kubenswrapper[4781]: I1208 20:35:12.187433 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h98l" event={"ID":"fca1339c-81f9-4ba1-ae68-da8529a5821b","Type":"ContainerStarted","Data":"777ae43d7013d80c3096b1b4d7a7068ebf82c122800e8c602b1191ed85369e7d"} Dec 08 20:35:12 crc kubenswrapper[4781]: I1208 20:35:12.728880 4781 scope.go:117] "RemoveContainer" containerID="7c9115503c2e7d7c25104a8e3be24dbba80b8d8ca0d1ecd56d371c61323d2096" Dec 08 20:35:12 crc kubenswrapper[4781]: I1208 20:35:12.762423 4781 scope.go:117] "RemoveContainer" 
containerID="c5fe51dde9ae8ae7b6ad5d42fea29b824b836a3924cd6b84fe8106110b673ef5" Dec 08 20:35:12 crc kubenswrapper[4781]: I1208 20:35:12.849736 4781 scope.go:117] "RemoveContainer" containerID="2426533ca9672cf99917ddb8d906aea5468713284691d9e893e68df66a2ec1d9" Dec 08 20:35:12 crc kubenswrapper[4781]: I1208 20:35:12.914498 4781 scope.go:117] "RemoveContainer" containerID="57bc8a3faf644b94b9e9fb2842e5d3b0e86071d2c94bf52cd4d1f16eeec5a217" Dec 08 20:35:12 crc kubenswrapper[4781]: I1208 20:35:12.982527 4781 scope.go:117] "RemoveContainer" containerID="0e108d620cb33dc584a0fa26bba94baa9648d815c3969c6c58b0e8c086d9ddab" Dec 08 20:35:13 crc kubenswrapper[4781]: I1208 20:35:13.012455 4781 scope.go:117] "RemoveContainer" containerID="2903d9d595f474a0edbe637cfe5462f526f70f335210ec3c44714dc6b4ce703d" Dec 08 20:35:13 crc kubenswrapper[4781]: I1208 20:35:13.029038 4781 scope.go:117] "RemoveContainer" containerID="112b865222beb978823f687a69cf100b9eedd1374633ddc235bb6bff1f8f404e" Dec 08 20:35:13 crc kubenswrapper[4781]: I1208 20:35:13.054037 4781 scope.go:117] "RemoveContainer" containerID="c48d3b2347b8e4b05bd1defc6558212509d9b0f36a31dce3b3c5c8cc5df204d2" Dec 08 20:35:13 crc kubenswrapper[4781]: I1208 20:35:13.197737 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h98l" event={"ID":"fca1339c-81f9-4ba1-ae68-da8529a5821b","Type":"ContainerStarted","Data":"6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4"} Dec 08 20:35:14 crc kubenswrapper[4781]: I1208 20:35:14.132538 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:35:14 crc kubenswrapper[4781]: E1208 20:35:14.133067 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:35:14 crc kubenswrapper[4781]: I1208 20:35:14.208771 4781 generic.go:334] "Generic (PLEG): container finished" podID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerID="6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4" exitCode=0 Dec 08 20:35:14 crc kubenswrapper[4781]: I1208 20:35:14.209032 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h98l" event={"ID":"fca1339c-81f9-4ba1-ae68-da8529a5821b","Type":"ContainerDied","Data":"6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4"} Dec 08 20:35:16 crc kubenswrapper[4781]: I1208 20:35:16.233582 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h98l" event={"ID":"fca1339c-81f9-4ba1-ae68-da8529a5821b","Type":"ContainerStarted","Data":"147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023"} Dec 08 20:35:16 crc kubenswrapper[4781]: I1208 20:35:16.260127 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4h98l" podStartSLOduration=2.586200544 podStartE2EDuration="6.260101906s" podCreationTimestamp="2025-12-08 20:35:10 +0000 UTC" firstStartedPulling="2025-12-08 20:35:12.189180783 +0000 UTC m=+1828.340464160" lastFinishedPulling="2025-12-08 20:35:15.863082145 +0000 UTC m=+1832.014365522" observedRunningTime="2025-12-08 20:35:16.252012203 +0000 UTC m=+1832.403295590" watchObservedRunningTime="2025-12-08 20:35:16.260101906 +0000 UTC m=+1832.411385283" Dec 08 20:35:21 crc kubenswrapper[4781]: I1208 20:35:21.218548 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:21 crc kubenswrapper[4781]: I1208 
20:35:21.219074 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:21 crc kubenswrapper[4781]: I1208 20:35:21.279797 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:21 crc kubenswrapper[4781]: I1208 20:35:21.339347 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:21 crc kubenswrapper[4781]: I1208 20:35:21.515246 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4h98l"] Dec 08 20:35:23 crc kubenswrapper[4781]: I1208 20:35:23.294641 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4h98l" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerName="registry-server" containerID="cri-o://147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023" gracePeriod=2 Dec 08 20:35:23 crc kubenswrapper[4781]: I1208 20:35:23.837304 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:23 crc kubenswrapper[4781]: I1208 20:35:23.965906 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-catalog-content\") pod \"fca1339c-81f9-4ba1-ae68-da8529a5821b\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " Dec 08 20:35:23 crc kubenswrapper[4781]: I1208 20:35:23.966445 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2lsd\" (UniqueName: \"kubernetes.io/projected/fca1339c-81f9-4ba1-ae68-da8529a5821b-kube-api-access-m2lsd\") pod \"fca1339c-81f9-4ba1-ae68-da8529a5821b\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " Dec 08 20:35:23 crc kubenswrapper[4781]: I1208 20:35:23.966650 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-utilities\") pod \"fca1339c-81f9-4ba1-ae68-da8529a5821b\" (UID: \"fca1339c-81f9-4ba1-ae68-da8529a5821b\") " Dec 08 20:35:23 crc kubenswrapper[4781]: I1208 20:35:23.967430 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-utilities" (OuterVolumeSpecName: "utilities") pod "fca1339c-81f9-4ba1-ae68-da8529a5821b" (UID: "fca1339c-81f9-4ba1-ae68-da8529a5821b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:35:23 crc kubenswrapper[4781]: I1208 20:35:23.974361 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fca1339c-81f9-4ba1-ae68-da8529a5821b-kube-api-access-m2lsd" (OuterVolumeSpecName: "kube-api-access-m2lsd") pod "fca1339c-81f9-4ba1-ae68-da8529a5821b" (UID: "fca1339c-81f9-4ba1-ae68-da8529a5821b"). InnerVolumeSpecName "kube-api-access-m2lsd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.015049 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fca1339c-81f9-4ba1-ae68-da8529a5821b" (UID: "fca1339c-81f9-4ba1-ae68-da8529a5821b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.046872 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrfdd"] Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.058089 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zrfdd"] Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.069853 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.069889 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fca1339c-81f9-4ba1-ae68-da8529a5821b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.069902 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2lsd\" (UniqueName: \"kubernetes.io/projected/fca1339c-81f9-4ba1-ae68-da8529a5821b-kube-api-access-m2lsd\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.138172 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed" path="/var/lib/kubelet/pods/38f74eb5-3cf1-4c3f-afe8-0e2aa28062ed/volumes" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.320116 4781 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4h98l" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.320136 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h98l" event={"ID":"fca1339c-81f9-4ba1-ae68-da8529a5821b","Type":"ContainerDied","Data":"147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023"} Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.320192 4781 scope.go:117] "RemoveContainer" containerID="147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.319912 4781 generic.go:334] "Generic (PLEG): container finished" podID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerID="147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023" exitCode=0 Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.321052 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4h98l" event={"ID":"fca1339c-81f9-4ba1-ae68-da8529a5821b","Type":"ContainerDied","Data":"777ae43d7013d80c3096b1b4d7a7068ebf82c122800e8c602b1191ed85369e7d"} Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.355904 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4h98l"] Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.355994 4781 scope.go:117] "RemoveContainer" containerID="6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.363493 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4h98l"] Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.377457 4781 scope.go:117] "RemoveContainer" containerID="51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.418316 4781 scope.go:117] "RemoveContainer" 
containerID="147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023" Dec 08 20:35:24 crc kubenswrapper[4781]: E1208 20:35:24.418745 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023\": container with ID starting with 147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023 not found: ID does not exist" containerID="147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.418803 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023"} err="failed to get container status \"147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023\": rpc error: code = NotFound desc = could not find container \"147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023\": container with ID starting with 147cab5a3e4bb3f38e41755e8abc4dcab6b9a28bfa3c130cda136009fe41a023 not found: ID does not exist" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.418836 4781 scope.go:117] "RemoveContainer" containerID="6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4" Dec 08 20:35:24 crc kubenswrapper[4781]: E1208 20:35:24.419321 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4\": container with ID starting with 6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4 not found: ID does not exist" containerID="6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.419357 4781 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4"} err="failed to get container status \"6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4\": rpc error: code = NotFound desc = could not find container \"6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4\": container with ID starting with 6208011c551f31562657e7897f1a73bd60e38413565ac113df08bebf314981c4 not found: ID does not exist" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.419379 4781 scope.go:117] "RemoveContainer" containerID="51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea" Dec 08 20:35:24 crc kubenswrapper[4781]: E1208 20:35:24.419647 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea\": container with ID starting with 51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea not found: ID does not exist" containerID="51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea" Dec 08 20:35:24 crc kubenswrapper[4781]: I1208 20:35:24.419677 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea"} err="failed to get container status \"51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea\": rpc error: code = NotFound desc = could not find container \"51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea\": container with ID starting with 51c5d9aaa82dace36fa4598f90d22f7e81d911d7cae7c6d151305114e33053ea not found: ID does not exist" Dec 08 20:35:26 crc kubenswrapper[4781]: I1208 20:35:26.138614 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" path="/var/lib/kubelet/pods/fca1339c-81f9-4ba1-ae68-da8529a5821b/volumes" Dec 08 20:35:28 crc kubenswrapper[4781]: I1208 
20:35:28.126296 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:35:28 crc kubenswrapper[4781]: E1208 20:35:28.127758 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:35:41 crc kubenswrapper[4781]: I1208 20:35:41.459714 4781 generic.go:334] "Generic (PLEG): container finished" podID="c08d5f1c-63e8-4974-b50c-29b0e8db5e9a" containerID="c66c6774ac26bad482f307e7db7bf89c497cfc5246910d1b31c309e8838f7ceb" exitCode=0 Dec 08 20:35:41 crc kubenswrapper[4781]: I1208 20:35:41.459828 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" event={"ID":"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a","Type":"ContainerDied","Data":"c66c6774ac26bad482f307e7db7bf89c497cfc5246910d1b31c309e8838f7ceb"} Dec 08 20:35:42 crc kubenswrapper[4781]: I1208 20:35:42.126847 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:35:42 crc kubenswrapper[4781]: E1208 20:35:42.127521 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:35:42 crc kubenswrapper[4781]: I1208 20:35:42.870709 4781 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.037067 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-ssh-key\") pod \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.037131 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7jqg\" (UniqueName: \"kubernetes.io/projected/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-kube-api-access-g7jqg\") pod \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.037461 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-inventory\") pod \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\" (UID: \"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a\") " Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.043577 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-kube-api-access-g7jqg" (OuterVolumeSpecName: "kube-api-access-g7jqg") pod "c08d5f1c-63e8-4974-b50c-29b0e8db5e9a" (UID: "c08d5f1c-63e8-4974-b50c-29b0e8db5e9a"). InnerVolumeSpecName "kube-api-access-g7jqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.073126 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c08d5f1c-63e8-4974-b50c-29b0e8db5e9a" (UID: "c08d5f1c-63e8-4974-b50c-29b0e8db5e9a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.086952 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-inventory" (OuterVolumeSpecName: "inventory") pod "c08d5f1c-63e8-4974-b50c-29b0e8db5e9a" (UID: "c08d5f1c-63e8-4974-b50c-29b0e8db5e9a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.140211 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.140253 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.140268 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7jqg\" (UniqueName: \"kubernetes.io/projected/c08d5f1c-63e8-4974-b50c-29b0e8db5e9a-kube-api-access-g7jqg\") on node \"crc\" DevicePath \"\"" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.480314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" event={"ID":"c08d5f1c-63e8-4974-b50c-29b0e8db5e9a","Type":"ContainerDied","Data":"aced59dea9130c918545b0f2ac5606d2277c6bdda31e961d424ce45d9ae8050d"} Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.480365 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aced59dea9130c918545b0f2ac5606d2277c6bdda31e961d424ce45d9ae8050d" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.480382 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-p9qr6" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.606854 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns"] Dec 08 20:35:43 crc kubenswrapper[4781]: E1208 20:35:43.607324 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerName="extract-content" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.607345 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerName="extract-content" Dec 08 20:35:43 crc kubenswrapper[4781]: E1208 20:35:43.607392 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08d5f1c-63e8-4974-b50c-29b0e8db5e9a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.607403 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08d5f1c-63e8-4974-b50c-29b0e8db5e9a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:35:43 crc kubenswrapper[4781]: E1208 20:35:43.607418 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerName="extract-utilities" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.607426 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerName="extract-utilities" Dec 08 20:35:43 crc kubenswrapper[4781]: E1208 20:35:43.607438 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerName="registry-server" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.607445 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerName="registry-server" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.607651 
4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08d5f1c-63e8-4974-b50c-29b0e8db5e9a" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.607676 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="fca1339c-81f9-4ba1-ae68-da8529a5821b" containerName="registry-server" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.608514 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.611226 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.611321 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.611748 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.613662 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns"] Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.615373 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.649558 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.649779 
4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.649890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jkqp\" (UniqueName: \"kubernetes.io/projected/c7f682bd-1ad1-4917-8c54-7f76ef956f09-kube-api-access-5jkqp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.751905 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.751985 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jkqp\" (UniqueName: \"kubernetes.io/projected/c7f682bd-1ad1-4917-8c54-7f76ef956f09-kube-api-access-5jkqp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.752094 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-ssh-key\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.756518 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.757337 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.769397 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jkqp\" (UniqueName: \"kubernetes.io/projected/c7f682bd-1ad1-4917-8c54-7f76ef956f09-kube-api-access-5jkqp\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hbzns\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:43 crc kubenswrapper[4781]: I1208 20:35:43.927404 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:35:44 crc kubenswrapper[4781]: I1208 20:35:44.488948 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns"] Dec 08 20:35:44 crc kubenswrapper[4781]: I1208 20:35:44.932028 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:35:45 crc kubenswrapper[4781]: I1208 20:35:45.499172 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" event={"ID":"c7f682bd-1ad1-4917-8c54-7f76ef956f09","Type":"ContainerStarted","Data":"67b0e5b32d66b82164b992f45a05267c4b481b764bc629005e7d6f4cfb630465"} Dec 08 20:35:45 crc kubenswrapper[4781]: I1208 20:35:45.499488 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" event={"ID":"c7f682bd-1ad1-4917-8c54-7f76ef956f09","Type":"ContainerStarted","Data":"5bd085b9843162ced4cf7987f4bcdc15fcaec45c3c67d446ed46dc693ddae7f0"} Dec 08 20:35:45 crc kubenswrapper[4781]: I1208 20:35:45.522596 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" podStartSLOduration=2.093304493 podStartE2EDuration="2.522575161s" podCreationTimestamp="2025-12-08 20:35:43 +0000 UTC" firstStartedPulling="2025-12-08 20:35:44.500365218 +0000 UTC m=+1860.651648595" lastFinishedPulling="2025-12-08 20:35:44.929635886 +0000 UTC m=+1861.080919263" observedRunningTime="2025-12-08 20:35:45.515413755 +0000 UTC m=+1861.666697152" watchObservedRunningTime="2025-12-08 20:35:45.522575161 +0000 UTC m=+1861.673858538" Dec 08 20:35:47 crc kubenswrapper[4781]: I1208 20:35:47.037372 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pkxxp"] Dec 08 20:35:47 crc kubenswrapper[4781]: 
I1208 20:35:47.045636 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pkxxp"] Dec 08 20:35:48 crc kubenswrapper[4781]: I1208 20:35:48.026180 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sqmzb"] Dec 08 20:35:48 crc kubenswrapper[4781]: I1208 20:35:48.033199 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sqmzb"] Dec 08 20:35:48 crc kubenswrapper[4781]: I1208 20:35:48.135987 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb" path="/var/lib/kubelet/pods/2c1ebe9d-4e7a-44a3-84ae-f7f17ded01bb/volumes" Dec 08 20:35:48 crc kubenswrapper[4781]: I1208 20:35:48.137322 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d1fdcf-4a08-4189-bea7-3e5399286272" path="/var/lib/kubelet/pods/81d1fdcf-4a08-4189-bea7-3e5399286272/volumes" Dec 08 20:35:54 crc kubenswrapper[4781]: I1208 20:35:54.133350 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:35:54 crc kubenswrapper[4781]: E1208 20:35:54.133961 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:36:09 crc kubenswrapper[4781]: I1208 20:36:09.125977 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:36:09 crc kubenswrapper[4781]: E1208 20:36:09.126883 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:36:13 crc kubenswrapper[4781]: I1208 20:36:13.200577 4781 scope.go:117] "RemoveContainer" containerID="8bfd56d687f86072d9c9e144cc9403520c08e92f6bd39dd28b0a4aadc0ccb874" Dec 08 20:36:13 crc kubenswrapper[4781]: I1208 20:36:13.247817 4781 scope.go:117] "RemoveContainer" containerID="924274b0475ca96e689285170a966a56819134a5e52e7295bc9da44021f4f826" Dec 08 20:36:13 crc kubenswrapper[4781]: I1208 20:36:13.292373 4781 scope.go:117] "RemoveContainer" containerID="355847e2b51871732e159f6ef626bfcacbe657c99bf6af0223d8dbcdaeaa060c" Dec 08 20:36:22 crc kubenswrapper[4781]: I1208 20:36:22.126564 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:36:22 crc kubenswrapper[4781]: E1208 20:36:22.127317 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:36:30 crc kubenswrapper[4781]: I1208 20:36:30.885667 4781 generic.go:334] "Generic (PLEG): container finished" podID="c7f682bd-1ad1-4917-8c54-7f76ef956f09" containerID="67b0e5b32d66b82164b992f45a05267c4b481b764bc629005e7d6f4cfb630465" exitCode=0 Dec 08 20:36:30 crc kubenswrapper[4781]: I1208 20:36:30.885755 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" 
event={"ID":"c7f682bd-1ad1-4917-8c54-7f76ef956f09","Type":"ContainerDied","Data":"67b0e5b32d66b82164b992f45a05267c4b481b764bc629005e7d6f4cfb630465"} Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.034396 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fq7lx"] Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.043001 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fq7lx"] Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.142127 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4f9f6a-d72d-41c6-92d1-7b90bdd35123" path="/var/lib/kubelet/pods/5d4f9f6a-d72d-41c6-92d1-7b90bdd35123/volumes" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.295400 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.338253 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jkqp\" (UniqueName: \"kubernetes.io/projected/c7f682bd-1ad1-4917-8c54-7f76ef956f09-kube-api-access-5jkqp\") pod \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.338327 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-inventory\") pod \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.338467 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-ssh-key\") pod \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\" (UID: \"c7f682bd-1ad1-4917-8c54-7f76ef956f09\") " Dec 08 
20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.344671 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7f682bd-1ad1-4917-8c54-7f76ef956f09-kube-api-access-5jkqp" (OuterVolumeSpecName: "kube-api-access-5jkqp") pod "c7f682bd-1ad1-4917-8c54-7f76ef956f09" (UID: "c7f682bd-1ad1-4917-8c54-7f76ef956f09"). InnerVolumeSpecName "kube-api-access-5jkqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.364251 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-inventory" (OuterVolumeSpecName: "inventory") pod "c7f682bd-1ad1-4917-8c54-7f76ef956f09" (UID: "c7f682bd-1ad1-4917-8c54-7f76ef956f09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.371343 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c7f682bd-1ad1-4917-8c54-7f76ef956f09" (UID: "c7f682bd-1ad1-4917-8c54-7f76ef956f09"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.441142 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jkqp\" (UniqueName: \"kubernetes.io/projected/c7f682bd-1ad1-4917-8c54-7f76ef956f09-kube-api-access-5jkqp\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.441175 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.441185 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7f682bd-1ad1-4917-8c54-7f76ef956f09-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.908253 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" event={"ID":"c7f682bd-1ad1-4917-8c54-7f76ef956f09","Type":"ContainerDied","Data":"5bd085b9843162ced4cf7987f4bcdc15fcaec45c3c67d446ed46dc693ddae7f0"} Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.908297 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bd085b9843162ced4cf7987f4bcdc15fcaec45c3c67d446ed46dc693ddae7f0" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.908364 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hbzns" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.998090 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b4xf8"] Dec 08 20:36:32 crc kubenswrapper[4781]: E1208 20:36:32.998506 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7f682bd-1ad1-4917-8c54-7f76ef956f09" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.998528 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7f682bd-1ad1-4917-8c54-7f76ef956f09" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.998771 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7f682bd-1ad1-4917-8c54-7f76ef956f09" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:36:32 crc kubenswrapper[4781]: I1208 20:36:32.999489 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.002456 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.003269 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.004029 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.005957 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.010960 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b4xf8"] Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.053821 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.054001 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.054340 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jhkzf\" (UniqueName: \"kubernetes.io/projected/857a2e5e-21b6-450e-8578-240e91f419f7-kube-api-access-jhkzf\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.156530 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhkzf\" (UniqueName: \"kubernetes.io/projected/857a2e5e-21b6-450e-8578-240e91f419f7-kube-api-access-jhkzf\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.156599 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.156655 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.160089 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.160853 4781 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.178448 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhkzf\" (UniqueName: \"kubernetes.io/projected/857a2e5e-21b6-450e-8578-240e91f419f7-kube-api-access-jhkzf\") pod \"ssh-known-hosts-edpm-deployment-b4xf8\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.319112 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.863554 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b4xf8"] Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.868263 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 20:36:33 crc kubenswrapper[4781]: I1208 20:36:33.917376 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" event={"ID":"857a2e5e-21b6-450e-8578-240e91f419f7","Type":"ContainerStarted","Data":"c828005a1cad8d8e4e6250dd7e84fe089a371cccc983ad3f634ea65dc8ac5679"} Dec 08 20:36:34 crc kubenswrapper[4781]: I1208 20:36:34.134218 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:36:34 crc kubenswrapper[4781]: E1208 20:36:34.134516 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:36:34 crc kubenswrapper[4781]: I1208 20:36:34.926022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" event={"ID":"857a2e5e-21b6-450e-8578-240e91f419f7","Type":"ContainerStarted","Data":"fd2bb7afaa6a46b8eb6d12771d7db395cb505eb53048624ff3e61b82433ce8bf"} Dec 08 20:36:34 crc kubenswrapper[4781]: I1208 20:36:34.949092 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" podStartSLOduration=2.245520503 podStartE2EDuration="2.949073592s" podCreationTimestamp="2025-12-08 20:36:32 +0000 UTC" firstStartedPulling="2025-12-08 20:36:33.868029294 +0000 UTC m=+1910.019312671" lastFinishedPulling="2025-12-08 20:36:34.571582383 +0000 UTC m=+1910.722865760" observedRunningTime="2025-12-08 20:36:34.942286816 +0000 UTC m=+1911.093570193" watchObservedRunningTime="2025-12-08 20:36:34.949073592 +0000 UTC m=+1911.100356969" Dec 08 20:36:41 crc kubenswrapper[4781]: I1208 20:36:41.983755 4781 generic.go:334] "Generic (PLEG): container finished" podID="857a2e5e-21b6-450e-8578-240e91f419f7" containerID="fd2bb7afaa6a46b8eb6d12771d7db395cb505eb53048624ff3e61b82433ce8bf" exitCode=0 Dec 08 20:36:41 crc kubenswrapper[4781]: I1208 20:36:41.983871 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" event={"ID":"857a2e5e-21b6-450e-8578-240e91f419f7","Type":"ContainerDied","Data":"fd2bb7afaa6a46b8eb6d12771d7db395cb505eb53048624ff3e61b82433ce8bf"} Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.386304 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.445627 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhkzf\" (UniqueName: \"kubernetes.io/projected/857a2e5e-21b6-450e-8578-240e91f419f7-kube-api-access-jhkzf\") pod \"857a2e5e-21b6-450e-8578-240e91f419f7\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.445666 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-ssh-key-openstack-edpm-ipam\") pod \"857a2e5e-21b6-450e-8578-240e91f419f7\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.445941 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-inventory-0\") pod \"857a2e5e-21b6-450e-8578-240e91f419f7\" (UID: \"857a2e5e-21b6-450e-8578-240e91f419f7\") " Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.450973 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/857a2e5e-21b6-450e-8578-240e91f419f7-kube-api-access-jhkzf" (OuterVolumeSpecName: "kube-api-access-jhkzf") pod "857a2e5e-21b6-450e-8578-240e91f419f7" (UID: "857a2e5e-21b6-450e-8578-240e91f419f7"). InnerVolumeSpecName "kube-api-access-jhkzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.475685 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "857a2e5e-21b6-450e-8578-240e91f419f7" (UID: "857a2e5e-21b6-450e-8578-240e91f419f7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.478185 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "857a2e5e-21b6-450e-8578-240e91f419f7" (UID: "857a2e5e-21b6-450e-8578-240e91f419f7"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.547521 4781 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.547555 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhkzf\" (UniqueName: \"kubernetes.io/projected/857a2e5e-21b6-450e-8578-240e91f419f7-kube-api-access-jhkzf\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:43 crc kubenswrapper[4781]: I1208 20:36:43.547565 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/857a2e5e-21b6-450e-8578-240e91f419f7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.001676 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" 
event={"ID":"857a2e5e-21b6-450e-8578-240e91f419f7","Type":"ContainerDied","Data":"c828005a1cad8d8e4e6250dd7e84fe089a371cccc983ad3f634ea65dc8ac5679"} Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.001706 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b4xf8" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.001710 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c828005a1cad8d8e4e6250dd7e84fe089a371cccc983ad3f634ea65dc8ac5679" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.068570 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq"] Dec 08 20:36:44 crc kubenswrapper[4781]: E1208 20:36:44.068958 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857a2e5e-21b6-450e-8578-240e91f419f7" containerName="ssh-known-hosts-edpm-deployment" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.068975 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="857a2e5e-21b6-450e-8578-240e91f419f7" containerName="ssh-known-hosts-edpm-deployment" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.069167 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="857a2e5e-21b6-450e-8578-240e91f419f7" containerName="ssh-known-hosts-edpm-deployment" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.069761 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.073536 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.073818 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.074018 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.074265 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.080544 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq"] Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.156741 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nknxm\" (UniqueName: \"kubernetes.io/projected/96db8e7d-bc3a-4804-af50-6f403dbbcc26-kube-api-access-nknxm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.156790 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.156860 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.257976 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.258750 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nknxm\" (UniqueName: \"kubernetes.io/projected/96db8e7d-bc3a-4804-af50-6f403dbbcc26-kube-api-access-nknxm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.258870 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.259808 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.262179 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:36:44 crc 
kubenswrapper[4781]: I1208 20:36:44.272491 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.273437 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.275042 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nknxm\" (UniqueName: \"kubernetes.io/projected/96db8e7d-bc3a-4804-af50-6f403dbbcc26-kube-api-access-nknxm\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-xxkqq\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.390217 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.398345 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:44 crc kubenswrapper[4781]: I1208 20:36:44.938496 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq"] Dec 08 20:36:45 crc kubenswrapper[4781]: I1208 20:36:45.011782 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" event={"ID":"96db8e7d-bc3a-4804-af50-6f403dbbcc26","Type":"ContainerStarted","Data":"5cf6b611bd4941d18dfbd015386866a6859ff206452854ba202e212e422c8927"} Dec 08 20:36:45 crc kubenswrapper[4781]: I1208 20:36:45.524816 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:36:46 crc kubenswrapper[4781]: I1208 20:36:46.023275 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" event={"ID":"96db8e7d-bc3a-4804-af50-6f403dbbcc26","Type":"ContainerStarted","Data":"cbd314ac9945f09e1308accfc061de1ba5eaa3032cc5c1b3913dd3e3194a18e1"} Dec 08 20:36:46 crc kubenswrapper[4781]: I1208 20:36:46.045511 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" podStartSLOduration=1.467816861 podStartE2EDuration="2.045493239s" podCreationTimestamp="2025-12-08 20:36:44 +0000 UTC" firstStartedPulling="2025-12-08 20:36:44.944849068 +0000 UTC m=+1921.096132445" lastFinishedPulling="2025-12-08 20:36:45.522525446 +0000 UTC m=+1921.673808823" observedRunningTime="2025-12-08 20:36:46.037209321 +0000 UTC m=+1922.188492698" watchObservedRunningTime="2025-12-08 20:36:46.045493239 +0000 UTC m=+1922.196776606" Dec 08 20:36:48 crc kubenswrapper[4781]: I1208 20:36:48.126517 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:36:48 crc kubenswrapper[4781]: E1208 
20:36:48.127202 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:36:54 crc kubenswrapper[4781]: I1208 20:36:54.115012 4781 generic.go:334] "Generic (PLEG): container finished" podID="96db8e7d-bc3a-4804-af50-6f403dbbcc26" containerID="cbd314ac9945f09e1308accfc061de1ba5eaa3032cc5c1b3913dd3e3194a18e1" exitCode=0 Dec 08 20:36:54 crc kubenswrapper[4781]: I1208 20:36:54.115091 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" event={"ID":"96db8e7d-bc3a-4804-af50-6f403dbbcc26","Type":"ContainerDied","Data":"cbd314ac9945f09e1308accfc061de1ba5eaa3032cc5c1b3913dd3e3194a18e1"} Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.514324 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.591068 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nknxm\" (UniqueName: \"kubernetes.io/projected/96db8e7d-bc3a-4804-af50-6f403dbbcc26-kube-api-access-nknxm\") pod \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.591306 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-inventory\") pod \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.591437 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-ssh-key\") pod \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\" (UID: \"96db8e7d-bc3a-4804-af50-6f403dbbcc26\") " Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.598227 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96db8e7d-bc3a-4804-af50-6f403dbbcc26-kube-api-access-nknxm" (OuterVolumeSpecName: "kube-api-access-nknxm") pod "96db8e7d-bc3a-4804-af50-6f403dbbcc26" (UID: "96db8e7d-bc3a-4804-af50-6f403dbbcc26"). InnerVolumeSpecName "kube-api-access-nknxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.617543 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-inventory" (OuterVolumeSpecName: "inventory") pod "96db8e7d-bc3a-4804-af50-6f403dbbcc26" (UID: "96db8e7d-bc3a-4804-af50-6f403dbbcc26"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.618243 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "96db8e7d-bc3a-4804-af50-6f403dbbcc26" (UID: "96db8e7d-bc3a-4804-af50-6f403dbbcc26"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.693365 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.693408 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96db8e7d-bc3a-4804-af50-6f403dbbcc26-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:55 crc kubenswrapper[4781]: I1208 20:36:55.693417 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nknxm\" (UniqueName: \"kubernetes.io/projected/96db8e7d-bc3a-4804-af50-6f403dbbcc26-kube-api-access-nknxm\") on node \"crc\" DevicePath \"\"" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.135582 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.145176 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-xxkqq" event={"ID":"96db8e7d-bc3a-4804-af50-6f403dbbcc26","Type":"ContainerDied","Data":"5cf6b611bd4941d18dfbd015386866a6859ff206452854ba202e212e422c8927"} Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.148162 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cf6b611bd4941d18dfbd015386866a6859ff206452854ba202e212e422c8927" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.214209 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx"] Dec 08 20:36:56 crc kubenswrapper[4781]: E1208 20:36:56.214964 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96db8e7d-bc3a-4804-af50-6f403dbbcc26" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.214981 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="96db8e7d-bc3a-4804-af50-6f403dbbcc26" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.215174 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="96db8e7d-bc3a-4804-af50-6f403dbbcc26" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.215794 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.225085 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx"] Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.225735 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.225818 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.226377 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.226407 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.304247 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.304324 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.304571 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rdr4\" (UniqueName: \"kubernetes.io/projected/29315241-935b-40dc-b49d-d8f18cbb4d38-kube-api-access-4rdr4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.406777 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rdr4\" (UniqueName: \"kubernetes.io/projected/29315241-935b-40dc-b49d-d8f18cbb4d38-kube-api-access-4rdr4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.406970 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.407018 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.413247 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: 
\"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.414135 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.427812 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rdr4\" (UniqueName: \"kubernetes.io/projected/29315241-935b-40dc-b49d-d8f18cbb4d38-kube-api-access-4rdr4\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:56 crc kubenswrapper[4781]: I1208 20:36:56.540786 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:36:57 crc kubenswrapper[4781]: I1208 20:36:57.095191 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx"] Dec 08 20:36:57 crc kubenswrapper[4781]: I1208 20:36:57.144969 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" event={"ID":"29315241-935b-40dc-b49d-d8f18cbb4d38","Type":"ContainerStarted","Data":"31cf6fce1bc4199998f3a2ae458fcb18d23ca14c72512e15532245873398ea96"} Dec 08 20:36:58 crc kubenswrapper[4781]: I1208 20:36:58.153959 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" event={"ID":"29315241-935b-40dc-b49d-d8f18cbb4d38","Type":"ContainerStarted","Data":"8b705507740c634472e2d0712429f96baf8c7c47fa8cef6545c74b403214ffac"} Dec 08 20:36:58 crc kubenswrapper[4781]: I1208 20:36:58.196850 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" podStartSLOduration=1.782181312 podStartE2EDuration="2.19682278s" podCreationTimestamp="2025-12-08 20:36:56 +0000 UTC" firstStartedPulling="2025-12-08 20:36:57.105869487 +0000 UTC m=+1933.257152874" lastFinishedPulling="2025-12-08 20:36:57.520510965 +0000 UTC m=+1933.671794342" observedRunningTime="2025-12-08 20:36:58.18675579 +0000 UTC m=+1934.338039187" watchObservedRunningTime="2025-12-08 20:36:58.19682278 +0000 UTC m=+1934.348106157" Dec 08 20:37:03 crc kubenswrapper[4781]: I1208 20:37:03.127152 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:37:03 crc kubenswrapper[4781]: E1208 20:37:03.128379 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:37:07 crc kubenswrapper[4781]: I1208 20:37:07.226852 4781 generic.go:334] "Generic (PLEG): container finished" podID="29315241-935b-40dc-b49d-d8f18cbb4d38" containerID="8b705507740c634472e2d0712429f96baf8c7c47fa8cef6545c74b403214ffac" exitCode=0 Dec 08 20:37:07 crc kubenswrapper[4781]: I1208 20:37:07.227173 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" event={"ID":"29315241-935b-40dc-b49d-d8f18cbb4d38","Type":"ContainerDied","Data":"8b705507740c634472e2d0712429f96baf8c7c47fa8cef6545c74b403214ffac"} Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.700373 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.839882 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-inventory\") pod \"29315241-935b-40dc-b49d-d8f18cbb4d38\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.840130 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rdr4\" (UniqueName: \"kubernetes.io/projected/29315241-935b-40dc-b49d-d8f18cbb4d38-kube-api-access-4rdr4\") pod \"29315241-935b-40dc-b49d-d8f18cbb4d38\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.840199 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-ssh-key\") pod \"29315241-935b-40dc-b49d-d8f18cbb4d38\" (UID: \"29315241-935b-40dc-b49d-d8f18cbb4d38\") " Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.845194 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29315241-935b-40dc-b49d-d8f18cbb4d38-kube-api-access-4rdr4" (OuterVolumeSpecName: "kube-api-access-4rdr4") pod "29315241-935b-40dc-b49d-d8f18cbb4d38" (UID: "29315241-935b-40dc-b49d-d8f18cbb4d38"). InnerVolumeSpecName "kube-api-access-4rdr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.869206 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29315241-935b-40dc-b49d-d8f18cbb4d38" (UID: "29315241-935b-40dc-b49d-d8f18cbb4d38"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.872147 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-inventory" (OuterVolumeSpecName: "inventory") pod "29315241-935b-40dc-b49d-d8f18cbb4d38" (UID: "29315241-935b-40dc-b49d-d8f18cbb4d38"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.942014 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rdr4\" (UniqueName: \"kubernetes.io/projected/29315241-935b-40dc-b49d-d8f18cbb4d38-kube-api-access-4rdr4\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.942066 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:08 crc kubenswrapper[4781]: I1208 20:37:08.942077 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29315241-935b-40dc-b49d-d8f18cbb4d38-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.246270 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" event={"ID":"29315241-935b-40dc-b49d-d8f18cbb4d38","Type":"ContainerDied","Data":"31cf6fce1bc4199998f3a2ae458fcb18d23ca14c72512e15532245873398ea96"} Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.246644 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31cf6fce1bc4199998f3a2ae458fcb18d23ca14c72512e15532245873398ea96" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.246501 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.336281 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf"] Dec 08 20:37:09 crc kubenswrapper[4781]: E1208 20:37:09.336754 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29315241-935b-40dc-b49d-d8f18cbb4d38" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.336779 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="29315241-935b-40dc-b49d-d8f18cbb4d38" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.337106 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="29315241-935b-40dc-b49d-d8f18cbb4d38" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.337860 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.343665 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.343964 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.344201 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.344249 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.344272 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.344414 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.344417 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.345160 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.354177 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf"] Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.450818 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.450876 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.450906 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.451340 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwb99\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-kube-api-access-gwb99\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.451531 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.451614 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.451804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.451993 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.452080 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.452215 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.452271 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.452300 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.452341 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.452473 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.553850 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.553893 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.553975 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554026 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554168 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554197 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: 
\"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554219 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554248 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554295 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554328 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: 
\"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554348 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554368 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.554392 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwb99\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-kube-api-access-gwb99\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.561033 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc 
kubenswrapper[4781]: I1208 20:37:09.561377 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.561632 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.563146 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.563164 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.563238 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.564053 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.565413 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.565507 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.565544 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.566317 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.567550 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.567766 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.578739 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwb99\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-kube-api-access-gwb99\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-z64vf\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:09 crc kubenswrapper[4781]: I1208 20:37:09.654836 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:10 crc kubenswrapper[4781]: I1208 20:37:10.162723 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf"] Dec 08 20:37:10 crc kubenswrapper[4781]: I1208 20:37:10.257517 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" event={"ID":"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e","Type":"ContainerStarted","Data":"87c40a6435a237b3f97407d9bb843036190613b02ddef7e100dac6af0ceebc08"} Dec 08 20:37:11 crc kubenswrapper[4781]: I1208 20:37:11.271862 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" event={"ID":"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e","Type":"ContainerStarted","Data":"07643c8d6cc04bb0b5bd333669ad0917f40a07cfe45423992c17185c5b3c2fe4"} Dec 08 20:37:11 crc kubenswrapper[4781]: I1208 20:37:11.293808 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" podStartSLOduration=1.879234404 podStartE2EDuration="2.293786287s" podCreationTimestamp="2025-12-08 20:37:09 +0000 UTC" firstStartedPulling="2025-12-08 20:37:10.166055837 +0000 UTC m=+1946.317339214" lastFinishedPulling="2025-12-08 20:37:10.58060772 +0000 UTC m=+1946.731891097" observedRunningTime="2025-12-08 20:37:11.292722056 +0000 UTC m=+1947.444005453" watchObservedRunningTime="2025-12-08 20:37:11.293786287 +0000 UTC m=+1947.445069674" Dec 08 20:37:13 crc kubenswrapper[4781]: I1208 20:37:13.418963 4781 
scope.go:117] "RemoveContainer" containerID="26c7fb32a9c94a7dbfda9ab0b2feccf37c902135b5a6b9566c3039906bbb3aef" Dec 08 20:37:14 crc kubenswrapper[4781]: I1208 20:37:14.132640 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:37:14 crc kubenswrapper[4781]: E1208 20:37:14.133023 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:37:27 crc kubenswrapper[4781]: I1208 20:37:27.125907 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:37:27 crc kubenswrapper[4781]: E1208 20:37:27.126656 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:37:42 crc kubenswrapper[4781]: I1208 20:37:42.125647 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:37:42 crc kubenswrapper[4781]: E1208 20:37:42.126762 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:37:46 crc kubenswrapper[4781]: I1208 20:37:46.577649 4781 generic.go:334] "Generic (PLEG): container finished" podID="98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" containerID="07643c8d6cc04bb0b5bd333669ad0917f40a07cfe45423992c17185c5b3c2fe4" exitCode=0 Dec 08 20:37:46 crc kubenswrapper[4781]: I1208 20:37:46.577759 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" event={"ID":"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e","Type":"ContainerDied","Data":"07643c8d6cc04bb0b5bd333669ad0917f40a07cfe45423992c17185c5b3c2fe4"} Dec 08 20:37:47 crc kubenswrapper[4781]: I1208 20:37:47.981907 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.009616 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-libvirt-combined-ca-bundle\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.009715 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.009762 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwb99\" 
(UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-kube-api-access-gwb99\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.009807 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-neutron-metadata-combined-ca-bundle\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.009830 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ovn-combined-ca-bundle\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.009865 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.009934 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-nova-combined-ca-bundle\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.009977 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-ovn-default-certs-0\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.010003 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ssh-key\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.010036 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-repo-setup-combined-ca-bundle\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.010061 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-telemetry-combined-ca-bundle\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.010094 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-bootstrap-combined-ca-bundle\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.010141 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-inventory\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: 
\"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.010160 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\" (UID: \"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e\") " Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.016190 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.017226 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.019189 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.019776 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.021819 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.026201 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.026367 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.026475 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.026517 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.027434 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.028011 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-kube-api-access-gwb99" (OuterVolumeSpecName: "kube-api-access-gwb99") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "kube-api-access-gwb99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.029168 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.048719 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.071262 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-inventory" (OuterVolumeSpecName: "inventory") pod "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" (UID: "98d6ed01-20fc-4e72-a8cf-2e53a8e6103e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.112871 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.112965 4781 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.112983 4781 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.112996 4781 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113008 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113022 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113035 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113047 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113059 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwb99\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-kube-api-access-gwb99\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113072 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113084 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113097 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.113112 4781 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 
20:37:48.113125 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98d6ed01-20fc-4e72-a8cf-2e53a8e6103e-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.605352 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" event={"ID":"98d6ed01-20fc-4e72-a8cf-2e53a8e6103e","Type":"ContainerDied","Data":"87c40a6435a237b3f97407d9bb843036190613b02ddef7e100dac6af0ceebc08"} Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.605403 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87c40a6435a237b3f97407d9bb843036190613b02ddef7e100dac6af0ceebc08" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.605441 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-z64vf" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.791205 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf"] Dec 08 20:37:48 crc kubenswrapper[4781]: E1208 20:37:48.791676 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.791701 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.791959 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d6ed01-20fc-4e72-a8cf-2e53a8e6103e" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.792784 4781 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.800653 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.800868 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.801056 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.801614 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.808692 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf"] Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.817567 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.940612 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjj86\" (UniqueName: \"kubernetes.io/projected/8f4acc2b-373a-48e5-916b-a0fcfcb83851-kube-api-access-cjj86\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.940684 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovncontroller-config-0\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.940729 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.940822 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:48 crc kubenswrapper[4781]: I1208 20:37:48.940874 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.042334 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjj86\" (UniqueName: \"kubernetes.io/projected/8f4acc2b-373a-48e5-916b-a0fcfcb83851-kube-api-access-cjj86\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.042404 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.042465 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.042528 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.042590 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.043595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.047120 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.047377 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.047555 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.059898 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjj86\" (UniqueName: \"kubernetes.io/projected/8f4acc2b-373a-48e5-916b-a0fcfcb83851-kube-api-access-cjj86\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-4wchf\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.111869 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:37:49 crc kubenswrapper[4781]: I1208 20:37:49.729398 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf"] Dec 08 20:37:50 crc kubenswrapper[4781]: I1208 20:37:50.632322 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" event={"ID":"8f4acc2b-373a-48e5-916b-a0fcfcb83851","Type":"ContainerStarted","Data":"3982b6a64751885a075614e18881c1d2ca4eb387c39e4404e07c170cb4274a0a"} Dec 08 20:37:50 crc kubenswrapper[4781]: I1208 20:37:50.632617 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" event={"ID":"8f4acc2b-373a-48e5-916b-a0fcfcb83851","Type":"ContainerStarted","Data":"9ca902c3c460dbc0451b9914aceee4d2ebb137fd2c7fad758fea6a556cfc3885"} Dec 08 20:37:50 crc kubenswrapper[4781]: I1208 20:37:50.663841 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" podStartSLOduration=2.071182606 podStartE2EDuration="2.663815451s" podCreationTimestamp="2025-12-08 20:37:48 +0000 UTC" firstStartedPulling="2025-12-08 20:37:49.743885076 +0000 UTC m=+1985.895168443" lastFinishedPulling="2025-12-08 20:37:50.336517911 +0000 UTC m=+1986.487801288" observedRunningTime="2025-12-08 20:37:50.649485368 +0000 UTC m=+1986.800768745" watchObservedRunningTime="2025-12-08 20:37:50.663815451 +0000 UTC m=+1986.815098828" Dec 08 20:37:53 crc kubenswrapper[4781]: I1208 20:37:53.126454 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:37:53 crc kubenswrapper[4781]: E1208 20:37:53.127119 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:38:04 crc kubenswrapper[4781]: I1208 20:38:04.132726 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:38:04 crc kubenswrapper[4781]: I1208 20:38:04.782689 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"4ecf2f2c52e8a643c4ad224bdcf72c4081cef5a126f0f1cdf084fd2fe65b6b95"} Dec 08 20:38:51 crc kubenswrapper[4781]: I1208 20:38:51.271993 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" event={"ID":"8f4acc2b-373a-48e5-916b-a0fcfcb83851","Type":"ContainerDied","Data":"3982b6a64751885a075614e18881c1d2ca4eb387c39e4404e07c170cb4274a0a"} Dec 08 20:38:51 crc kubenswrapper[4781]: I1208 20:38:51.271902 4781 generic.go:334] "Generic (PLEG): container finished" podID="8f4acc2b-373a-48e5-916b-a0fcfcb83851" containerID="3982b6a64751885a075614e18881c1d2ca4eb387c39e4404e07c170cb4274a0a" exitCode=0 Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.679792 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.845932 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovn-combined-ca-bundle\") pod \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.846107 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjj86\" (UniqueName: \"kubernetes.io/projected/8f4acc2b-373a-48e5-916b-a0fcfcb83851-kube-api-access-cjj86\") pod \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.846145 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ssh-key\") pod \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.846170 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-inventory\") pod \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.846227 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovncontroller-config-0\") pod \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\" (UID: \"8f4acc2b-373a-48e5-916b-a0fcfcb83851\") " Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.855712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8f4acc2b-373a-48e5-916b-a0fcfcb83851" (UID: "8f4acc2b-373a-48e5-916b-a0fcfcb83851"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.859483 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4acc2b-373a-48e5-916b-a0fcfcb83851-kube-api-access-cjj86" (OuterVolumeSpecName: "kube-api-access-cjj86") pod "8f4acc2b-373a-48e5-916b-a0fcfcb83851" (UID: "8f4acc2b-373a-48e5-916b-a0fcfcb83851"). InnerVolumeSpecName "kube-api-access-cjj86". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.874809 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-inventory" (OuterVolumeSpecName: "inventory") pod "8f4acc2b-373a-48e5-916b-a0fcfcb83851" (UID: "8f4acc2b-373a-48e5-916b-a0fcfcb83851"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.875260 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8f4acc2b-373a-48e5-916b-a0fcfcb83851" (UID: "8f4acc2b-373a-48e5-916b-a0fcfcb83851"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.888402 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8f4acc2b-373a-48e5-916b-a0fcfcb83851" (UID: "8f4acc2b-373a-48e5-916b-a0fcfcb83851"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.948593 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjj86\" (UniqueName: \"kubernetes.io/projected/8f4acc2b-373a-48e5-916b-a0fcfcb83851-kube-api-access-cjj86\") on node \"crc\" DevicePath \"\"" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.948640 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.948653 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.948666 4781 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:38:52 crc kubenswrapper[4781]: I1208 20:38:52.948678 4781 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f4acc2b-373a-48e5-916b-a0fcfcb83851-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.294385 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" event={"ID":"8f4acc2b-373a-48e5-916b-a0fcfcb83851","Type":"ContainerDied","Data":"9ca902c3c460dbc0451b9914aceee4d2ebb137fd2c7fad758fea6a556cfc3885"} Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.294445 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-4wchf" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.294455 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ca902c3c460dbc0451b9914aceee4d2ebb137fd2c7fad758fea6a556cfc3885" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.401208 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh"] Dec 08 20:38:53 crc kubenswrapper[4781]: E1208 20:38:53.401643 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4acc2b-373a-48e5-916b-a0fcfcb83851" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.401659 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4acc2b-373a-48e5-916b-a0fcfcb83851" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.401852 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4acc2b-373a-48e5-916b-a0fcfcb83851" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.402504 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.407133 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.407520 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.407615 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.410204 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.414487 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.421962 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.428280 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh"] Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.561071 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.561136 4781 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hmxx\" (UniqueName: \"kubernetes.io/projected/f0d82d95-8bf8-4845-a305-cca05358ffdb-kube-api-access-6hmxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.561172 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.561193 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.561347 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.561504 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.663153 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.663236 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hmxx\" (UniqueName: \"kubernetes.io/projected/f0d82d95-8bf8-4845-a305-cca05358ffdb-kube-api-access-6hmxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.663281 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.663309 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.663367 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.663439 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.672754 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.674755 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.681613 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.689683 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.694595 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.706843 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hmxx\" (UniqueName: \"kubernetes.io/projected/f0d82d95-8bf8-4845-a305-cca05358ffdb-kube-api-access-6hmxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:53 crc kubenswrapper[4781]: I1208 20:38:53.719354 
4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:38:54 crc kubenswrapper[4781]: I1208 20:38:54.260686 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh"] Dec 08 20:38:54 crc kubenswrapper[4781]: I1208 20:38:54.303251 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" event={"ID":"f0d82d95-8bf8-4845-a305-cca05358ffdb","Type":"ContainerStarted","Data":"5940c407f9da7b6abae2fae73c3efdc651a639bee867c96a4158fbdcab8c1fed"} Dec 08 20:38:55 crc kubenswrapper[4781]: I1208 20:38:55.312814 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" event={"ID":"f0d82d95-8bf8-4845-a305-cca05358ffdb","Type":"ContainerStarted","Data":"f5465a042561df059599878fd32f6a2340173a4518291e1936b2e2563cce5625"} Dec 08 20:38:55 crc kubenswrapper[4781]: I1208 20:38:55.331569 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" podStartSLOduration=1.9381797939999998 podStartE2EDuration="2.331548047s" podCreationTimestamp="2025-12-08 20:38:53 +0000 UTC" firstStartedPulling="2025-12-08 20:38:54.263134825 +0000 UTC m=+2050.414418202" lastFinishedPulling="2025-12-08 20:38:54.656503078 +0000 UTC m=+2050.807786455" observedRunningTime="2025-12-08 20:38:55.327540592 +0000 UTC m=+2051.478823969" watchObservedRunningTime="2025-12-08 20:38:55.331548047 +0000 UTC m=+2051.482831424" Dec 08 20:39:41 crc kubenswrapper[4781]: I1208 20:39:41.781604 4781 generic.go:334] "Generic (PLEG): container finished" podID="f0d82d95-8bf8-4845-a305-cca05358ffdb" containerID="f5465a042561df059599878fd32f6a2340173a4518291e1936b2e2563cce5625" exitCode=0 Dec 08 20:39:41 crc kubenswrapper[4781]: 
I1208 20:39:41.781737 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" event={"ID":"f0d82d95-8bf8-4845-a305-cca05358ffdb","Type":"ContainerDied","Data":"f5465a042561df059599878fd32f6a2340173a4518291e1936b2e2563cce5625"} Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.211070 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.304987 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-nova-metadata-neutron-config-0\") pod \"f0d82d95-8bf8-4845-a305-cca05358ffdb\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.305082 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f0d82d95-8bf8-4845-a305-cca05358ffdb\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.305121 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-metadata-combined-ca-bundle\") pod \"f0d82d95-8bf8-4845-a305-cca05358ffdb\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.305218 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-ssh-key\") pod \"f0d82d95-8bf8-4845-a305-cca05358ffdb\" (UID: 
\"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.305254 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-inventory\") pod \"f0d82d95-8bf8-4845-a305-cca05358ffdb\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.305327 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hmxx\" (UniqueName: \"kubernetes.io/projected/f0d82d95-8bf8-4845-a305-cca05358ffdb-kube-api-access-6hmxx\") pod \"f0d82d95-8bf8-4845-a305-cca05358ffdb\" (UID: \"f0d82d95-8bf8-4845-a305-cca05358ffdb\") " Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.319340 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f0d82d95-8bf8-4845-a305-cca05358ffdb" (UID: "f0d82d95-8bf8-4845-a305-cca05358ffdb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.319400 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0d82d95-8bf8-4845-a305-cca05358ffdb-kube-api-access-6hmxx" (OuterVolumeSpecName: "kube-api-access-6hmxx") pod "f0d82d95-8bf8-4845-a305-cca05358ffdb" (UID: "f0d82d95-8bf8-4845-a305-cca05358ffdb"). InnerVolumeSpecName "kube-api-access-6hmxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.339712 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f0d82d95-8bf8-4845-a305-cca05358ffdb" (UID: "f0d82d95-8bf8-4845-a305-cca05358ffdb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.342030 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f0d82d95-8bf8-4845-a305-cca05358ffdb" (UID: "f0d82d95-8bf8-4845-a305-cca05358ffdb"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.369401 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f0d82d95-8bf8-4845-a305-cca05358ffdb" (UID: "f0d82d95-8bf8-4845-a305-cca05358ffdb"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.371056 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-inventory" (OuterVolumeSpecName: "inventory") pod "f0d82d95-8bf8-4845-a305-cca05358ffdb" (UID: "f0d82d95-8bf8-4845-a305-cca05358ffdb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.407598 4781 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.407642 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.407658 4781 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.407671 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.407682 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f0d82d95-8bf8-4845-a305-cca05358ffdb-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.407693 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hmxx\" (UniqueName: \"kubernetes.io/projected/f0d82d95-8bf8-4845-a305-cca05358ffdb-kube-api-access-6hmxx\") on node \"crc\" DevicePath \"\"" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.806080 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" 
event={"ID":"f0d82d95-8bf8-4845-a305-cca05358ffdb","Type":"ContainerDied","Data":"5940c407f9da7b6abae2fae73c3efdc651a639bee867c96a4158fbdcab8c1fed"} Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.806154 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.806160 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5940c407f9da7b6abae2fae73c3efdc651a639bee867c96a4158fbdcab8c1fed" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.903931 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk"] Dec 08 20:39:43 crc kubenswrapper[4781]: E1208 20:39:43.904636 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0d82d95-8bf8-4845-a305-cca05358ffdb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.904710 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0d82d95-8bf8-4845-a305-cca05358ffdb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.904960 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0d82d95-8bf8-4845-a305-cca05358ffdb" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.905737 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.907832 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.907984 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.908103 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.910617 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.910851 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:39:43 crc kubenswrapper[4781]: I1208 20:39:43.916856 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk"] Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.020981 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.021044 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: 
\"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.021102 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.021307 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb99z\" (UniqueName: \"kubernetes.io/projected/0d819560-5e37-4cbe-8276-f5c63dd9610c-kube-api-access-lb99z\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.021588 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.122855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.123240 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.123374 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.123520 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.123624 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb99z\" (UniqueName: \"kubernetes.io/projected/0d819560-5e37-4cbe-8276-f5c63dd9610c-kube-api-access-lb99z\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.131649 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.131771 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.132499 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.139795 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.152441 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb99z\" (UniqueName: \"kubernetes.io/projected/0d819560-5e37-4cbe-8276-f5c63dd9610c-kube-api-access-lb99z\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-496zk\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.227893 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 
20:39:44.236137 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.748486 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk"] Dec 08 20:39:44 crc kubenswrapper[4781]: W1208 20:39:44.756528 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d819560_5e37_4cbe_8276_f5c63dd9610c.slice/crio-f6149d12238222b73d433dceb8c679374a8938af26bb66bff898bf8cc74b8b7f WatchSource:0}: Error finding container f6149d12238222b73d433dceb8c679374a8938af26bb66bff898bf8cc74b8b7f: Status 404 returned error can't find the container with id f6149d12238222b73d433dceb8c679374a8938af26bb66bff898bf8cc74b8b7f Dec 08 20:39:44 crc kubenswrapper[4781]: I1208 20:39:44.816319 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" event={"ID":"0d819560-5e37-4cbe-8276-f5c63dd9610c","Type":"ContainerStarted","Data":"f6149d12238222b73d433dceb8c679374a8938af26bb66bff898bf8cc74b8b7f"} Dec 08 20:39:45 crc kubenswrapper[4781]: I1208 20:39:45.211149 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:39:45 crc kubenswrapper[4781]: I1208 20:39:45.826036 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" event={"ID":"0d819560-5e37-4cbe-8276-f5c63dd9610c","Type":"ContainerStarted","Data":"a53f09a40800a827e4bce54f8971574d6526712fa20035334414667a13e0d78d"} Dec 08 20:39:45 crc kubenswrapper[4781]: I1208 20:39:45.853011 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" podStartSLOduration=2.404982652 podStartE2EDuration="2.852983556s" 
podCreationTimestamp="2025-12-08 20:39:43 +0000 UTC" firstStartedPulling="2025-12-08 20:39:44.759325838 +0000 UTC m=+2100.910609215" lastFinishedPulling="2025-12-08 20:39:45.207326742 +0000 UTC m=+2101.358610119" observedRunningTime="2025-12-08 20:39:45.840873737 +0000 UTC m=+2101.992157124" watchObservedRunningTime="2025-12-08 20:39:45.852983556 +0000 UTC m=+2102.004266933" Dec 08 20:40:29 crc kubenswrapper[4781]: I1208 20:40:29.947967 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:40:29 crc kubenswrapper[4781]: I1208 20:40:29.948445 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.075293 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7swzk"] Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.078010 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.089695 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7swzk"] Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.155764 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-utilities\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.156205 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f559q\" (UniqueName: \"kubernetes.io/projected/06cf0885-29b6-4ffc-905f-c2962bf52823-kube-api-access-f559q\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.156247 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-catalog-content\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.259949 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-catalog-content\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.260183 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-utilities\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.260244 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f559q\" (UniqueName: \"kubernetes.io/projected/06cf0885-29b6-4ffc-905f-c2962bf52823-kube-api-access-f559q\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.260628 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-catalog-content\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.260722 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-utilities\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.286076 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f559q\" (UniqueName: \"kubernetes.io/projected/06cf0885-29b6-4ffc-905f-c2962bf52823-kube-api-access-f559q\") pod \"community-operators-7swzk\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.447354 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.948310 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.948606 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:40:59 crc kubenswrapper[4781]: I1208 20:40:59.960349 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7swzk"] Dec 08 20:41:00 crc kubenswrapper[4781]: I1208 20:41:00.532237 4781 generic.go:334] "Generic (PLEG): container finished" podID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerID="9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429" exitCode=0 Dec 08 20:41:00 crc kubenswrapper[4781]: I1208 20:41:00.532516 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swzk" event={"ID":"06cf0885-29b6-4ffc-905f-c2962bf52823","Type":"ContainerDied","Data":"9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429"} Dec 08 20:41:00 crc kubenswrapper[4781]: I1208 20:41:00.532943 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swzk" event={"ID":"06cf0885-29b6-4ffc-905f-c2962bf52823","Type":"ContainerStarted","Data":"4b055037774a26465f005d98b6be2c1a237c1bf4700b643aabf8aaf40ce4d2d0"} Dec 08 20:41:01 crc kubenswrapper[4781]: I1208 20:41:01.543704 4781 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-7swzk" event={"ID":"06cf0885-29b6-4ffc-905f-c2962bf52823","Type":"ContainerStarted","Data":"be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763"} Dec 08 20:41:02 crc kubenswrapper[4781]: I1208 20:41:02.555248 4781 generic.go:334] "Generic (PLEG): container finished" podID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerID="be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763" exitCode=0 Dec 08 20:41:02 crc kubenswrapper[4781]: I1208 20:41:02.555338 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swzk" event={"ID":"06cf0885-29b6-4ffc-905f-c2962bf52823","Type":"ContainerDied","Data":"be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763"} Dec 08 20:41:03 crc kubenswrapper[4781]: I1208 20:41:03.568461 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swzk" event={"ID":"06cf0885-29b6-4ffc-905f-c2962bf52823","Type":"ContainerStarted","Data":"6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860"} Dec 08 20:41:03 crc kubenswrapper[4781]: I1208 20:41:03.590430 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7swzk" podStartSLOduration=2.185370536 podStartE2EDuration="4.59040504s" podCreationTimestamp="2025-12-08 20:40:59 +0000 UTC" firstStartedPulling="2025-12-08 20:41:00.534081337 +0000 UTC m=+2176.685364724" lastFinishedPulling="2025-12-08 20:41:02.939115851 +0000 UTC m=+2179.090399228" observedRunningTime="2025-12-08 20:41:03.586374504 +0000 UTC m=+2179.737657891" watchObservedRunningTime="2025-12-08 20:41:03.59040504 +0000 UTC m=+2179.741688417" Dec 08 20:41:09 crc kubenswrapper[4781]: I1208 20:41:09.453518 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:41:09 crc kubenswrapper[4781]: 
I1208 20:41:09.454170 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:41:09 crc kubenswrapper[4781]: I1208 20:41:09.509768 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:41:09 crc kubenswrapper[4781]: I1208 20:41:09.668982 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:41:09 crc kubenswrapper[4781]: I1208 20:41:09.746880 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7swzk"] Dec 08 20:41:11 crc kubenswrapper[4781]: I1208 20:41:11.634282 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7swzk" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerName="registry-server" containerID="cri-o://6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860" gracePeriod=2 Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.091975 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.208704 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-utilities\") pod \"06cf0885-29b6-4ffc-905f-c2962bf52823\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.208815 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f559q\" (UniqueName: \"kubernetes.io/projected/06cf0885-29b6-4ffc-905f-c2962bf52823-kube-api-access-f559q\") pod \"06cf0885-29b6-4ffc-905f-c2962bf52823\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.210045 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-utilities" (OuterVolumeSpecName: "utilities") pod "06cf0885-29b6-4ffc-905f-c2962bf52823" (UID: "06cf0885-29b6-4ffc-905f-c2962bf52823"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.210181 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-catalog-content\") pod \"06cf0885-29b6-4ffc-905f-c2962bf52823\" (UID: \"06cf0885-29b6-4ffc-905f-c2962bf52823\") " Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.211328 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.214608 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06cf0885-29b6-4ffc-905f-c2962bf52823-kube-api-access-f559q" (OuterVolumeSpecName: "kube-api-access-f559q") pod "06cf0885-29b6-4ffc-905f-c2962bf52823" (UID: "06cf0885-29b6-4ffc-905f-c2962bf52823"). InnerVolumeSpecName "kube-api-access-f559q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.306708 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "06cf0885-29b6-4ffc-905f-c2962bf52823" (UID: "06cf0885-29b6-4ffc-905f-c2962bf52823"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.313357 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f559q\" (UniqueName: \"kubernetes.io/projected/06cf0885-29b6-4ffc-905f-c2962bf52823-kube-api-access-f559q\") on node \"crc\" DevicePath \"\"" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.313436 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/06cf0885-29b6-4ffc-905f-c2962bf52823-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.644806 4781 generic.go:334] "Generic (PLEG): container finished" podID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerID="6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860" exitCode=0 Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.644856 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swzk" event={"ID":"06cf0885-29b6-4ffc-905f-c2962bf52823","Type":"ContainerDied","Data":"6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860"} Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.644890 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7swzk" event={"ID":"06cf0885-29b6-4ffc-905f-c2962bf52823","Type":"ContainerDied","Data":"4b055037774a26465f005d98b6be2c1a237c1bf4700b643aabf8aaf40ce4d2d0"} Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.644931 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7swzk" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.644994 4781 scope.go:117] "RemoveContainer" containerID="6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.664820 4781 scope.go:117] "RemoveContainer" containerID="be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763" Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.689689 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7swzk"] Dec 08 20:41:12 crc kubenswrapper[4781]: I1208 20:41:12.700114 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7swzk"] Dec 08 20:41:13 crc kubenswrapper[4781]: I1208 20:41:13.575724 4781 scope.go:117] "RemoveContainer" containerID="9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429" Dec 08 20:41:13 crc kubenswrapper[4781]: I1208 20:41:13.622131 4781 scope.go:117] "RemoveContainer" containerID="6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860" Dec 08 20:41:13 crc kubenswrapper[4781]: E1208 20:41:13.622556 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860\": container with ID starting with 6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860 not found: ID does not exist" containerID="6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860" Dec 08 20:41:13 crc kubenswrapper[4781]: I1208 20:41:13.622589 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860"} err="failed to get container status \"6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860\": rpc error: code = NotFound desc = could not find 
container \"6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860\": container with ID starting with 6703de76bbacfec9f11258388f1ad2461b344f1e783417c4528e9c7f55451860 not found: ID does not exist" Dec 08 20:41:13 crc kubenswrapper[4781]: I1208 20:41:13.622609 4781 scope.go:117] "RemoveContainer" containerID="be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763" Dec 08 20:41:13 crc kubenswrapper[4781]: E1208 20:41:13.623104 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763\": container with ID starting with be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763 not found: ID does not exist" containerID="be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763" Dec 08 20:41:13 crc kubenswrapper[4781]: I1208 20:41:13.623128 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763"} err="failed to get container status \"be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763\": rpc error: code = NotFound desc = could not find container \"be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763\": container with ID starting with be04ee70b02a562cb63d2cc5dff7c23f1aaeb03828be41858969fa39cc85a763 not found: ID does not exist" Dec 08 20:41:13 crc kubenswrapper[4781]: I1208 20:41:13.623141 4781 scope.go:117] "RemoveContainer" containerID="9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429" Dec 08 20:41:13 crc kubenswrapper[4781]: E1208 20:41:13.624034 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429\": container with ID starting with 9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429 not found: ID does 
not exist" containerID="9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429" Dec 08 20:41:13 crc kubenswrapper[4781]: I1208 20:41:13.624061 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429"} err="failed to get container status \"9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429\": rpc error: code = NotFound desc = could not find container \"9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429\": container with ID starting with 9f71bdb399bd9b5e38f577b7341bfdc39ddec682dc257a5500b467c8fd7ec429 not found: ID does not exist" Dec 08 20:41:14 crc kubenswrapper[4781]: I1208 20:41:14.136995 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" path="/var/lib/kubelet/pods/06cf0885-29b6-4ffc-905f-c2962bf52823/volumes" Dec 08 20:41:29 crc kubenswrapper[4781]: I1208 20:41:29.948698 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:41:29 crc kubenswrapper[4781]: I1208 20:41:29.949352 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:41:29 crc kubenswrapper[4781]: I1208 20:41:29.949410 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:41:29 crc kubenswrapper[4781]: I1208 20:41:29.950362 4781 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ecf2f2c52e8a643c4ad224bdcf72c4081cef5a126f0f1cdf084fd2fe65b6b95"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:41:29 crc kubenswrapper[4781]: I1208 20:41:29.950454 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://4ecf2f2c52e8a643c4ad224bdcf72c4081cef5a126f0f1cdf084fd2fe65b6b95" gracePeriod=600 Dec 08 20:41:30 crc kubenswrapper[4781]: I1208 20:41:30.843028 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="4ecf2f2c52e8a643c4ad224bdcf72c4081cef5a126f0f1cdf084fd2fe65b6b95" exitCode=0 Dec 08 20:41:30 crc kubenswrapper[4781]: I1208 20:41:30.843192 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"4ecf2f2c52e8a643c4ad224bdcf72c4081cef5a126f0f1cdf084fd2fe65b6b95"} Dec 08 20:41:30 crc kubenswrapper[4781]: I1208 20:41:30.843686 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"} Dec 08 20:41:30 crc kubenswrapper[4781]: I1208 20:41:30.843715 4781 scope.go:117] "RemoveContainer" containerID="18b11ff75f5824cb07f8a62bd89b5ff1b053992e0e618bebd3115f0bba2ac68c" Dec 08 20:43:54 crc kubenswrapper[4781]: I1208 20:43:54.185331 4781 generic.go:334] "Generic (PLEG): container finished" podID="0d819560-5e37-4cbe-8276-f5c63dd9610c" 
containerID="a53f09a40800a827e4bce54f8971574d6526712fa20035334414667a13e0d78d" exitCode=0 Dec 08 20:43:54 crc kubenswrapper[4781]: I1208 20:43:54.185423 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" event={"ID":"0d819560-5e37-4cbe-8276-f5c63dd9610c","Type":"ContainerDied","Data":"a53f09a40800a827e4bce54f8971574d6526712fa20035334414667a13e0d78d"} Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.143489 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nfvpd"] Dec 08 20:43:55 crc kubenswrapper[4781]: E1208 20:43:55.144023 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerName="extract-utilities" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.144048 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerName="extract-utilities" Dec 08 20:43:55 crc kubenswrapper[4781]: E1208 20:43:55.144081 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerName="extract-content" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.144090 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerName="extract-content" Dec 08 20:43:55 crc kubenswrapper[4781]: E1208 20:43:55.144105 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerName="registry-server" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.144112 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerName="registry-server" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.144347 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="06cf0885-29b6-4ffc-905f-c2962bf52823" containerName="registry-server" Dec 08 
20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.146263 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.168442 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfvpd"] Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.170400 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb4wz\" (UniqueName: \"kubernetes.io/projected/6324d715-b7d1-40d9-8164-d4efd09085cc-kube-api-access-cb4wz\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.170835 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-catalog-content\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.170989 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-utilities\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.272802 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb4wz\" (UniqueName: \"kubernetes.io/projected/6324d715-b7d1-40d9-8164-d4efd09085cc-kube-api-access-cb4wz\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " 
pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.273303 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-catalog-content\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.273335 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-utilities\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.274354 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-utilities\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.275330 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-catalog-content\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.298055 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb4wz\" (UniqueName: \"kubernetes.io/projected/6324d715-b7d1-40d9-8164-d4efd09085cc-kube-api-access-cb4wz\") pod \"redhat-marketplace-nfvpd\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") " pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 
20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.474568 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfvpd" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.604630 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.783685 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb99z\" (UniqueName: \"kubernetes.io/projected/0d819560-5e37-4cbe-8276-f5c63dd9610c-kube-api-access-lb99z\") pod \"0d819560-5e37-4cbe-8276-f5c63dd9610c\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.784810 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-ssh-key\") pod \"0d819560-5e37-4cbe-8276-f5c63dd9610c\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.784858 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-secret-0\") pod \"0d819560-5e37-4cbe-8276-f5c63dd9610c\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.785011 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-inventory\") pod \"0d819560-5e37-4cbe-8276-f5c63dd9610c\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.785730 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-combined-ca-bundle\") pod \"0d819560-5e37-4cbe-8276-f5c63dd9610c\" (UID: \"0d819560-5e37-4cbe-8276-f5c63dd9610c\") " Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.793892 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "0d819560-5e37-4cbe-8276-f5c63dd9610c" (UID: "0d819560-5e37-4cbe-8276-f5c63dd9610c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.795133 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d819560-5e37-4cbe-8276-f5c63dd9610c-kube-api-access-lb99z" (OuterVolumeSpecName: "kube-api-access-lb99z") pod "0d819560-5e37-4cbe-8276-f5c63dd9610c" (UID: "0d819560-5e37-4cbe-8276-f5c63dd9610c"). InnerVolumeSpecName "kube-api-access-lb99z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.817844 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d819560-5e37-4cbe-8276-f5c63dd9610c" (UID: "0d819560-5e37-4cbe-8276-f5c63dd9610c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.819649 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "0d819560-5e37-4cbe-8276-f5c63dd9610c" (UID: "0d819560-5e37-4cbe-8276-f5c63dd9610c"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.841259 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-inventory" (OuterVolumeSpecName: "inventory") pod "0d819560-5e37-4cbe-8276-f5c63dd9610c" (UID: "0d819560-5e37-4cbe-8276-f5c63dd9610c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.887625 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.887674 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.887689 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb99z\" (UniqueName: \"kubernetes.io/projected/0d819560-5e37-4cbe-8276-f5c63dd9610c-kube-api-access-lb99z\") on node \"crc\" DevicePath \"\"" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.887699 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:43:55 crc kubenswrapper[4781]: I1208 20:43:55.887714 4781 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/0d819560-5e37-4cbe-8276-f5c63dd9610c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.076035 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfvpd"] Dec 08 
20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.203618 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" event={"ID":"0d819560-5e37-4cbe-8276-f5c63dd9610c","Type":"ContainerDied","Data":"f6149d12238222b73d433dceb8c679374a8938af26bb66bff898bf8cc74b8b7f"} Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.203703 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6149d12238222b73d433dceb8c679374a8938af26bb66bff898bf8cc74b8b7f" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.203764 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-496zk" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.208041 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfvpd" event={"ID":"6324d715-b7d1-40d9-8164-d4efd09085cc","Type":"ContainerStarted","Data":"ca0b691090fbdb4b27d6f4bd88858315322deb8487954134d2493f177a0a9d4e"} Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.306065 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"] Dec 08 20:43:56 crc kubenswrapper[4781]: E1208 20:43:56.306560 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d819560-5e37-4cbe-8276-f5c63dd9610c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.306579 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d819560-5e37-4cbe-8276-f5c63dd9610c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.306844 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d819560-5e37-4cbe-8276-f5c63dd9610c" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 
20:43:56.307726 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.311493 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.312839 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.312840 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.312888 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.314273 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.314408 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.314720 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"] Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.315690 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401282 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" Dec 08 
20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401625 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401689 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401711 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w9xc\" (UniqueName: \"kubernetes.io/projected/69ce8819-1a24-4b28-9438-c92c07b4dbca-kube-api-access-6w9xc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401804 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401843 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401891 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401973 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.401996 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.503946 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.503981 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w9xc\" (UniqueName: \"kubernetes.io/projected/69ce8819-1a24-4b28-9438-c92c07b4dbca-kube-api-access-6w9xc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.504027 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.504045 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.504082 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.504102 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.504119 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.504180 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.504205 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.505754 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.510504 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.510597 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.511008 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.512431 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.512526 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.516359 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.516724 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.538049 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w9xc\" (UniqueName: \"kubernetes.io/projected/69ce8819-1a24-4b28-9438-c92c07b4dbca-kube-api-access-6w9xc\") pod \"nova-edpm-deployment-openstack-edpm-ipam-5rsvg\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.564387 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nbkfc"]
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.567522 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.568096 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nbkfc"]
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.606316 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-catalog-content\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.606479 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pv6k\" (UniqueName: \"kubernetes.io/projected/708d3f33-55ef-4c25-997f-5fd13a23c7d0-kube-api-access-2pv6k\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.606681 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-utilities\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.683945 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.710270 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pv6k\" (UniqueName: \"kubernetes.io/projected/708d3f33-55ef-4c25-997f-5fd13a23c7d0-kube-api-access-2pv6k\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.710636 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-utilities\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.710783 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-catalog-content\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.711449 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-catalog-content\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.711485 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-utilities\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.744044 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pv6k\" (UniqueName: \"kubernetes.io/projected/708d3f33-55ef-4c25-997f-5fd13a23c7d0-kube-api-access-2pv6k\") pod \"redhat-operators-nbkfc\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") " pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:56 crc kubenswrapper[4781]: I1208 20:43:56.926592 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:43:57 crc kubenswrapper[4781]: I1208 20:43:57.221029 4781 generic.go:334] "Generic (PLEG): container finished" podID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerID="13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6" exitCode=0
Dec 08 20:43:57 crc kubenswrapper[4781]: I1208 20:43:57.221301 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfvpd" event={"ID":"6324d715-b7d1-40d9-8164-d4efd09085cc","Type":"ContainerDied","Data":"13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6"}
Dec 08 20:43:57 crc kubenswrapper[4781]: I1208 20:43:57.225263 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 08 20:43:57 crc kubenswrapper[4781]: I1208 20:43:57.272011 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg"]
Dec 08 20:43:57 crc kubenswrapper[4781]: W1208 20:43:57.275859 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69ce8819_1a24_4b28_9438_c92c07b4dbca.slice/crio-0d4674cee71310bbd3c56d7cefc79fb3a912e71103a7f7d3850eecef3b26f049 WatchSource:0}: Error finding container 0d4674cee71310bbd3c56d7cefc79fb3a912e71103a7f7d3850eecef3b26f049: Status 404 returned error can't find the container with id 0d4674cee71310bbd3c56d7cefc79fb3a912e71103a7f7d3850eecef3b26f049
Dec 08 20:43:57 crc kubenswrapper[4781]: W1208 20:43:57.394883 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod708d3f33_55ef_4c25_997f_5fd13a23c7d0.slice/crio-250b0e8b00f2227821c27a9402d91393fabea4dbc671a35448435ad19cecb622 WatchSource:0}: Error finding container 250b0e8b00f2227821c27a9402d91393fabea4dbc671a35448435ad19cecb622: Status 404 returned error can't find the container with id 250b0e8b00f2227821c27a9402d91393fabea4dbc671a35448435ad19cecb622
Dec 08 20:43:57 crc kubenswrapper[4781]: I1208 20:43:57.399345 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nbkfc"]
Dec 08 20:43:58 crc kubenswrapper[4781]: I1208 20:43:58.234403 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" event={"ID":"69ce8819-1a24-4b28-9438-c92c07b4dbca","Type":"ContainerStarted","Data":"4983761f2c57d0281c3ba9769cca57be3eb37e147e5c0cea0db36ec7089dd83c"}
Dec 08 20:43:58 crc kubenswrapper[4781]: I1208 20:43:58.235022 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" event={"ID":"69ce8819-1a24-4b28-9438-c92c07b4dbca","Type":"ContainerStarted","Data":"0d4674cee71310bbd3c56d7cefc79fb3a912e71103a7f7d3850eecef3b26f049"}
Dec 08 20:43:58 crc kubenswrapper[4781]: I1208 20:43:58.237225 4781 generic.go:334] "Generic (PLEG): container finished" podID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerID="621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199" exitCode=0
Dec 08 20:43:58 crc kubenswrapper[4781]: I1208 20:43:58.237566 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbkfc" event={"ID":"708d3f33-55ef-4c25-997f-5fd13a23c7d0","Type":"ContainerDied","Data":"621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199"}
Dec 08 20:43:58 crc kubenswrapper[4781]: I1208 20:43:58.237616 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbkfc" event={"ID":"708d3f33-55ef-4c25-997f-5fd13a23c7d0","Type":"ContainerStarted","Data":"250b0e8b00f2227821c27a9402d91393fabea4dbc671a35448435ad19cecb622"}
Dec 08 20:43:58 crc kubenswrapper[4781]: I1208 20:43:58.240711 4781 generic.go:334] "Generic (PLEG): container finished" podID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerID="a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628" exitCode=0
Dec 08 20:43:58 crc kubenswrapper[4781]: I1208 20:43:58.240743 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfvpd" event={"ID":"6324d715-b7d1-40d9-8164-d4efd09085cc","Type":"ContainerDied","Data":"a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628"}
Dec 08 20:43:58 crc kubenswrapper[4781]: I1208 20:43:58.265742 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" podStartSLOduration=1.7203888410000001 podStartE2EDuration="2.265715159s" podCreationTimestamp="2025-12-08 20:43:56 +0000 UTC" firstStartedPulling="2025-12-08 20:43:57.278314865 +0000 UTC m=+2353.429598242" lastFinishedPulling="2025-12-08 20:43:57.823641173 +0000 UTC m=+2353.974924560" observedRunningTime="2025-12-08 20:43:58.252380366 +0000 UTC m=+2354.403663743" watchObservedRunningTime="2025-12-08 20:43:58.265715159 +0000 UTC m=+2354.416998536"
Dec 08 20:43:59 crc kubenswrapper[4781]: I1208 20:43:59.250935 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfvpd" event={"ID":"6324d715-b7d1-40d9-8164-d4efd09085cc","Type":"ContainerStarted","Data":"b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338"}
Dec 08 20:43:59 crc kubenswrapper[4781]: I1208 20:43:59.254088 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbkfc" event={"ID":"708d3f33-55ef-4c25-997f-5fd13a23c7d0","Type":"ContainerStarted","Data":"b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5"}
Dec 08 20:43:59 crc kubenswrapper[4781]: I1208 20:43:59.273043 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nfvpd" podStartSLOduration=2.857953707 podStartE2EDuration="4.273009303s" podCreationTimestamp="2025-12-08 20:43:55 +0000 UTC" firstStartedPulling="2025-12-08 20:43:57.224940553 +0000 UTC m=+2353.376223940" lastFinishedPulling="2025-12-08 20:43:58.639996139 +0000 UTC m=+2354.791279536" observedRunningTime="2025-12-08 20:43:59.270979315 +0000 UTC m=+2355.422262692" watchObservedRunningTime="2025-12-08 20:43:59.273009303 +0000 UTC m=+2355.424292680"
Dec 08 20:43:59 crc kubenswrapper[4781]: I1208 20:43:59.948402 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 20:43:59 crc kubenswrapper[4781]: I1208 20:43:59.948759 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 20:44:02 crc kubenswrapper[4781]: I1208 20:44:02.285985 4781 generic.go:334] "Generic (PLEG): container finished" podID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerID="b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5" exitCode=0
Dec 08 20:44:02 crc kubenswrapper[4781]: I1208 20:44:02.287017 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbkfc" event={"ID":"708d3f33-55ef-4c25-997f-5fd13a23c7d0","Type":"ContainerDied","Data":"b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5"}
Dec 08 20:44:03 crc kubenswrapper[4781]: I1208 20:44:03.298485 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbkfc" event={"ID":"708d3f33-55ef-4c25-997f-5fd13a23c7d0","Type":"ContainerStarted","Data":"adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23"}
Dec 08 20:44:03 crc kubenswrapper[4781]: I1208 20:44:03.321783 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nbkfc" podStartSLOduration=2.706622322 podStartE2EDuration="7.321763904s" podCreationTimestamp="2025-12-08 20:43:56 +0000 UTC" firstStartedPulling="2025-12-08 20:43:58.238666733 +0000 UTC m=+2354.389950110" lastFinishedPulling="2025-12-08 20:44:02.853808315 +0000 UTC m=+2359.005091692" observedRunningTime="2025-12-08 20:44:03.316694108 +0000 UTC m=+2359.467977485" watchObservedRunningTime="2025-12-08 20:44:03.321763904 +0000 UTC m=+2359.473047271"
Dec 08 20:44:05 crc kubenswrapper[4781]: I1208 20:44:05.474680 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nfvpd"
Dec 08 20:44:05 crc kubenswrapper[4781]: I1208 20:44:05.475250 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nfvpd"
Dec 08 20:44:05 crc kubenswrapper[4781]: I1208 20:44:05.535564 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nfvpd"
Dec 08 20:44:06 crc kubenswrapper[4781]: I1208 20:44:06.366862 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nfvpd"
Dec 08 20:44:06 crc kubenswrapper[4781]: I1208 20:44:06.927262 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:44:06 crc kubenswrapper[4781]: I1208 20:44:06.927323 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:44:07 crc kubenswrapper[4781]: I1208 20:44:07.128828 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfvpd"]
Dec 08 20:44:07 crc kubenswrapper[4781]: I1208 20:44:07.979128 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nbkfc" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="registry-server" probeResult="failure" output=<
Dec 08 20:44:07 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s
Dec 08 20:44:07 crc kubenswrapper[4781]: >
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.338126 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nfvpd" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerName="registry-server" containerID="cri-o://b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338" gracePeriod=2
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.862390 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfvpd"
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.962681 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-utilities\") pod \"6324d715-b7d1-40d9-8164-d4efd09085cc\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") "
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.963096 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb4wz\" (UniqueName: \"kubernetes.io/projected/6324d715-b7d1-40d9-8164-d4efd09085cc-kube-api-access-cb4wz\") pod \"6324d715-b7d1-40d9-8164-d4efd09085cc\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") "
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.963367 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-catalog-content\") pod \"6324d715-b7d1-40d9-8164-d4efd09085cc\" (UID: \"6324d715-b7d1-40d9-8164-d4efd09085cc\") "
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.963642 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-utilities" (OuterVolumeSpecName: "utilities") pod "6324d715-b7d1-40d9-8164-d4efd09085cc" (UID: "6324d715-b7d1-40d9-8164-d4efd09085cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.963975 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.969419 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6324d715-b7d1-40d9-8164-d4efd09085cc-kube-api-access-cb4wz" (OuterVolumeSpecName: "kube-api-access-cb4wz") pod "6324d715-b7d1-40d9-8164-d4efd09085cc" (UID: "6324d715-b7d1-40d9-8164-d4efd09085cc"). InnerVolumeSpecName "kube-api-access-cb4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:44:08 crc kubenswrapper[4781]: I1208 20:44:08.986276 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6324d715-b7d1-40d9-8164-d4efd09085cc" (UID: "6324d715-b7d1-40d9-8164-d4efd09085cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.066267 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6324d715-b7d1-40d9-8164-d4efd09085cc-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.066342 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb4wz\" (UniqueName: \"kubernetes.io/projected/6324d715-b7d1-40d9-8164-d4efd09085cc-kube-api-access-cb4wz\") on node \"crc\" DevicePath \"\""
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.349218 4781 generic.go:334] "Generic (PLEG): container finished" podID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerID="b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338" exitCode=0
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.349307 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nfvpd"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.349348 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfvpd" event={"ID":"6324d715-b7d1-40d9-8164-d4efd09085cc","Type":"ContainerDied","Data":"b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338"}
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.349691 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nfvpd" event={"ID":"6324d715-b7d1-40d9-8164-d4efd09085cc","Type":"ContainerDied","Data":"ca0b691090fbdb4b27d6f4bd88858315322deb8487954134d2493f177a0a9d4e"}
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.349753 4781 scope.go:117] "RemoveContainer" containerID="b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.379339 4781 scope.go:117] "RemoveContainer" containerID="a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.390984 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfvpd"]
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.404547 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nfvpd"]
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.422154 4781 scope.go:117] "RemoveContainer" containerID="13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.444783 4781 scope.go:117] "RemoveContainer" containerID="b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338"
Dec 08 20:44:09 crc kubenswrapper[4781]: E1208 20:44:09.445373 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338\": container with ID starting with b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338 not found: ID does not exist" containerID="b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.445412 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338"} err="failed to get container status \"b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338\": rpc error: code = NotFound desc = could not find container \"b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338\": container with ID starting with b9bcce25243cb12c7362f1c4a6c6af614e2ba26b4ec0567615e23bb2f87f7338 not found: ID does not exist"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.445449 4781 scope.go:117] "RemoveContainer" containerID="a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628"
Dec 08 20:44:09 crc kubenswrapper[4781]: E1208 20:44:09.445785 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628\": container with ID starting with a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628 not found: ID does not exist" containerID="a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.445841 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628"} err="failed to get container status \"a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628\": rpc error: code = NotFound desc = could not find container \"a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628\": container with ID starting with a4f8ab8824ef5e7bc705015d549e6ffd466bb519353465b1ab2f05f569fe6628 not found: ID does not exist"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.445868 4781 scope.go:117] "RemoveContainer" containerID="13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6"
Dec 08 20:44:09 crc kubenswrapper[4781]: E1208 20:44:09.446298 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6\": container with ID starting with 13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6 not found: ID does not exist" containerID="13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6"
Dec 08 20:44:09 crc kubenswrapper[4781]: I1208 20:44:09.446322 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6"} err="failed to get container status \"13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6\": rpc error: code = NotFound desc = could not find container \"13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6\": container with ID starting with 13a2b0e374566e79a1742fa88f68b1eda4a0eda4e30b8256db428237b5292dd6 not found: ID does not exist"
Dec 08 20:44:10 crc kubenswrapper[4781]: I1208 20:44:10.137154 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" path="/var/lib/kubelet/pods/6324d715-b7d1-40d9-8164-d4efd09085cc/volumes"
Dec 08 20:44:17 crc kubenswrapper[4781]: I1208 20:44:17.019304 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:44:17 crc kubenswrapper[4781]: I1208 20:44:17.072703 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:44:17 crc kubenswrapper[4781]: I1208 20:44:17.259137 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nbkfc"]
Dec 08 20:44:18 crc kubenswrapper[4781]: I1208 20:44:18.427779 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nbkfc" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="registry-server" containerID="cri-o://adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23" gracePeriod=2
Dec 08 20:44:18 crc kubenswrapper[4781]: I1208 20:44:18.875683 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:44:18 crc kubenswrapper[4781]: I1208 20:44:18.997026 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-catalog-content\") pod \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") "
Dec 08 20:44:18 crc kubenswrapper[4781]: I1208 20:44:18.997069 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-utilities\") pod \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") "
Dec 08 20:44:18 crc kubenswrapper[4781]: I1208 20:44:18.997264 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pv6k\" (UniqueName: \"kubernetes.io/projected/708d3f33-55ef-4c25-997f-5fd13a23c7d0-kube-api-access-2pv6k\") pod \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\" (UID: \"708d3f33-55ef-4c25-997f-5fd13a23c7d0\") "
Dec 08 20:44:18 crc kubenswrapper[4781]: I1208 20:44:18.998158 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-utilities" (OuterVolumeSpecName: "utilities") pod "708d3f33-55ef-4c25-997f-5fd13a23c7d0" (UID: "708d3f33-55ef-4c25-997f-5fd13a23c7d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.004140 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708d3f33-55ef-4c25-997f-5fd13a23c7d0-kube-api-access-2pv6k" (OuterVolumeSpecName: "kube-api-access-2pv6k") pod "708d3f33-55ef-4c25-997f-5fd13a23c7d0" (UID: "708d3f33-55ef-4c25-997f-5fd13a23c7d0"). InnerVolumeSpecName "kube-api-access-2pv6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.099201 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.099243 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pv6k\" (UniqueName: \"kubernetes.io/projected/708d3f33-55ef-4c25-997f-5fd13a23c7d0-kube-api-access-2pv6k\") on node \"crc\" DevicePath \"\""
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.110579 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "708d3f33-55ef-4c25-997f-5fd13a23c7d0" (UID: "708d3f33-55ef-4c25-997f-5fd13a23c7d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.200676 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/708d3f33-55ef-4c25-997f-5fd13a23c7d0-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.439491 4781 generic.go:334] "Generic (PLEG): container finished" podID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerID="adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23" exitCode=0
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.439563 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbkfc" event={"ID":"708d3f33-55ef-4c25-997f-5fd13a23c7d0","Type":"ContainerDied","Data":"adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23"}
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.439573 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nbkfc"
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.439604 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nbkfc" event={"ID":"708d3f33-55ef-4c25-997f-5fd13a23c7d0","Type":"ContainerDied","Data":"250b0e8b00f2227821c27a9402d91393fabea4dbc671a35448435ad19cecb622"}
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.439635 4781 scope.go:117] "RemoveContainer" containerID="adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23"
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.465309 4781 scope.go:117] "RemoveContainer" containerID="b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5"
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.491010 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nbkfc"]
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.516675 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nbkfc"]
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.528135 4781 scope.go:117] "RemoveContainer" containerID="621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199"
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.605195 4781 scope.go:117] "RemoveContainer" containerID="adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23"
Dec 08 20:44:19 crc kubenswrapper[4781]: E1208 20:44:19.610348 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23\": container with ID starting with adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23 not found: ID does not exist" containerID="adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23"
Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.610396 4781
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23"} err="failed to get container status \"adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23\": rpc error: code = NotFound desc = could not find container \"adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23\": container with ID starting with adbf12dfa48cb0758ff75e801a2c9f672bc8fd9100b159e95bed53c98be2df23 not found: ID does not exist" Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.610419 4781 scope.go:117] "RemoveContainer" containerID="b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5" Dec 08 20:44:19 crc kubenswrapper[4781]: E1208 20:44:19.615061 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5\": container with ID starting with b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5 not found: ID does not exist" containerID="b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5" Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.615104 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5"} err="failed to get container status \"b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5\": rpc error: code = NotFound desc = could not find container \"b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5\": container with ID starting with b34b9b2def69db7dcf00a713d5a3c834f9b3c9593d921e8148f3c29a24092fd5 not found: ID does not exist" Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.615128 4781 scope.go:117] "RemoveContainer" containerID="621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199" Dec 08 20:44:19 crc kubenswrapper[4781]: E1208 
20:44:19.615508 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199\": container with ID starting with 621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199 not found: ID does not exist" containerID="621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199" Dec 08 20:44:19 crc kubenswrapper[4781]: I1208 20:44:19.615529 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199"} err="failed to get container status \"621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199\": rpc error: code = NotFound desc = could not find container \"621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199\": container with ID starting with 621a7d3a5de1a4f0ebe761031cdc3eb5999d00ea2f798e4443dab5690e1ae199 not found: ID does not exist" Dec 08 20:44:20 crc kubenswrapper[4781]: I1208 20:44:20.136217 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" path="/var/lib/kubelet/pods/708d3f33-55ef-4c25-997f-5fd13a23c7d0/volumes" Dec 08 20:44:29 crc kubenswrapper[4781]: I1208 20:44:29.964990 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:44:29 crc kubenswrapper[4781]: I1208 20:44:29.965568 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 08 20:44:59 crc kubenswrapper[4781]: I1208 20:44:59.947853 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:44:59 crc kubenswrapper[4781]: I1208 20:44:59.948412 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:44:59 crc kubenswrapper[4781]: I1208 20:44:59.948483 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:44:59 crc kubenswrapper[4781]: I1208 20:44:59.949248 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:44:59 crc kubenswrapper[4781]: I1208 20:44:59.949300 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" gracePeriod=600 Dec 08 20:45:00 crc kubenswrapper[4781]: E1208 20:45:00.072601 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.166148 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b"] Dec 08 20:45:00 crc kubenswrapper[4781]: E1208 20:45:00.166885 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerName="extract-content" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.166930 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerName="extract-content" Dec 08 20:45:00 crc kubenswrapper[4781]: E1208 20:45:00.166993 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="registry-server" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.167005 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="registry-server" Dec 08 20:45:00 crc kubenswrapper[4781]: E1208 20:45:00.167017 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerName="extract-utilities" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.167027 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerName="extract-utilities" Dec 08 20:45:00 crc kubenswrapper[4781]: E1208 20:45:00.167050 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="extract-utilities" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.167059 4781 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="extract-utilities" Dec 08 20:45:00 crc kubenswrapper[4781]: E1208 20:45:00.167085 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerName="registry-server" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.167095 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerName="registry-server" Dec 08 20:45:00 crc kubenswrapper[4781]: E1208 20:45:00.167109 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="extract-content" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.167117 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="extract-content" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.167357 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="708d3f33-55ef-4c25-997f-5fd13a23c7d0" containerName="registry-server" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.167387 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="6324d715-b7d1-40d9-8164-d4efd09085cc" containerName="registry-server" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.168483 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.173229 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.173495 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.176802 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b"] Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.257578 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce658f2a-dd91-4746-8435-361f5c961cb5-secret-volume\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.257677 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce658f2a-dd91-4746-8435-361f5c961cb5-config-volume\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.257736 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9xrg\" (UniqueName: \"kubernetes.io/projected/ce658f2a-dd91-4746-8435-361f5c961cb5-kube-api-access-g9xrg\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.358886 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce658f2a-dd91-4746-8435-361f5c961cb5-secret-volume\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.358953 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce658f2a-dd91-4746-8435-361f5c961cb5-config-volume\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.358988 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9xrg\" (UniqueName: \"kubernetes.io/projected/ce658f2a-dd91-4746-8435-361f5c961cb5-kube-api-access-g9xrg\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.360389 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce658f2a-dd91-4746-8435-361f5c961cb5-config-volume\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.366260 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ce658f2a-dd91-4746-8435-361f5c961cb5-secret-volume\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.377577 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9xrg\" (UniqueName: \"kubernetes.io/projected/ce658f2a-dd91-4746-8435-361f5c961cb5-kube-api-access-g9xrg\") pod \"collect-profiles-29420445-8rw4b\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.495138 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.820751 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" exitCode=0 Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.820833 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"} Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.821084 4781 scope.go:117] "RemoveContainer" containerID="4ecf2f2c52e8a643c4ad224bdcf72c4081cef5a126f0f1cdf084fd2fe65b6b95" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.821738 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:45:00 crc kubenswrapper[4781]: E1208 20:45:00.822206 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:45:00 crc kubenswrapper[4781]: I1208 20:45:00.938441 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b"] Dec 08 20:45:01 crc kubenswrapper[4781]: I1208 20:45:01.831230 4781 generic.go:334] "Generic (PLEG): container finished" podID="ce658f2a-dd91-4746-8435-361f5c961cb5" containerID="e575872bf4e44074063b40be5e9acb2ba7fe1bf7ee4036c9d844138551e6e4bb" exitCode=0 Dec 08 20:45:01 crc kubenswrapper[4781]: I1208 20:45:01.831363 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" event={"ID":"ce658f2a-dd91-4746-8435-361f5c961cb5","Type":"ContainerDied","Data":"e575872bf4e44074063b40be5e9acb2ba7fe1bf7ee4036c9d844138551e6e4bb"} Dec 08 20:45:01 crc kubenswrapper[4781]: I1208 20:45:01.831585 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" event={"ID":"ce658f2a-dd91-4746-8435-361f5c961cb5","Type":"ContainerStarted","Data":"44e893cdf2152e02cac01754ed9c5c7d64a3890c0aa11e68a0b637378008a2ed"} Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.196321 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.325371 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce658f2a-dd91-4746-8435-361f5c961cb5-config-volume\") pod \"ce658f2a-dd91-4746-8435-361f5c961cb5\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.325465 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce658f2a-dd91-4746-8435-361f5c961cb5-secret-volume\") pod \"ce658f2a-dd91-4746-8435-361f5c961cb5\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.325633 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9xrg\" (UniqueName: \"kubernetes.io/projected/ce658f2a-dd91-4746-8435-361f5c961cb5-kube-api-access-g9xrg\") pod \"ce658f2a-dd91-4746-8435-361f5c961cb5\" (UID: \"ce658f2a-dd91-4746-8435-361f5c961cb5\") " Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.326659 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce658f2a-dd91-4746-8435-361f5c961cb5-config-volume" (OuterVolumeSpecName: "config-volume") pod "ce658f2a-dd91-4746-8435-361f5c961cb5" (UID: "ce658f2a-dd91-4746-8435-361f5c961cb5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.332220 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce658f2a-dd91-4746-8435-361f5c961cb5-kube-api-access-g9xrg" (OuterVolumeSpecName: "kube-api-access-g9xrg") pod "ce658f2a-dd91-4746-8435-361f5c961cb5" (UID: "ce658f2a-dd91-4746-8435-361f5c961cb5"). 
InnerVolumeSpecName "kube-api-access-g9xrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.332392 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce658f2a-dd91-4746-8435-361f5c961cb5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ce658f2a-dd91-4746-8435-361f5c961cb5" (UID: "ce658f2a-dd91-4746-8435-361f5c961cb5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.429890 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9xrg\" (UniqueName: \"kubernetes.io/projected/ce658f2a-dd91-4746-8435-361f5c961cb5-kube-api-access-g9xrg\") on node \"crc\" DevicePath \"\"" Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.429990 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce658f2a-dd91-4746-8435-361f5c961cb5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.430006 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ce658f2a-dd91-4746-8435-361f5c961cb5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.849435 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" event={"ID":"ce658f2a-dd91-4746-8435-361f5c961cb5","Type":"ContainerDied","Data":"44e893cdf2152e02cac01754ed9c5c7d64a3890c0aa11e68a0b637378008a2ed"} Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.849730 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44e893cdf2152e02cac01754ed9c5c7d64a3890c0aa11e68a0b637378008a2ed" Dec 08 20:45:03 crc kubenswrapper[4781]: I1208 20:45:03.849500 4781 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420445-8rw4b" Dec 08 20:45:04 crc kubenswrapper[4781]: I1208 20:45:04.272958 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"] Dec 08 20:45:04 crc kubenswrapper[4781]: I1208 20:45:04.280633 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420400-xh5nm"] Dec 08 20:45:06 crc kubenswrapper[4781]: I1208 20:45:06.137969 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d965c6-71e4-4594-a70e-aad1e2b24c3f" path="/var/lib/kubelet/pods/23d965c6-71e4-4594-a70e-aad1e2b24c3f/volumes" Dec 08 20:45:13 crc kubenswrapper[4781]: I1208 20:45:13.692534 4781 scope.go:117] "RemoveContainer" containerID="ca17385bea9f01b4b036c14fe4357bd0b051eacdb6cbf7ecf3be6e1db5d6016b" Dec 08 20:45:16 crc kubenswrapper[4781]: I1208 20:45:16.125756 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:45:16 crc kubenswrapper[4781]: E1208 20:45:16.126608 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:45:27 crc kubenswrapper[4781]: I1208 20:45:27.125739 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:45:27 crc kubenswrapper[4781]: E1208 20:45:27.126486 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:45:40 crc kubenswrapper[4781]: I1208 20:45:40.126412 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:45:40 crc kubenswrapper[4781]: E1208 20:45:40.127203 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:45:51 crc kubenswrapper[4781]: I1208 20:45:51.149499 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:45:51 crc kubenswrapper[4781]: E1208 20:45:51.150721 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:46:05 crc kubenswrapper[4781]: I1208 20:46:05.126244 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:46:05 crc kubenswrapper[4781]: E1208 20:46:05.127015 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:46:17 crc kubenswrapper[4781]: I1208 20:46:17.126966 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:46:17 crc kubenswrapper[4781]: E1208 20:46:17.128141 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:46:31 crc kubenswrapper[4781]: I1208 20:46:31.126378 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:46:31 crc kubenswrapper[4781]: E1208 20:46:31.127710 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:46:32 crc kubenswrapper[4781]: I1208 20:46:32.895814 4781 generic.go:334] "Generic (PLEG): container finished" podID="69ce8819-1a24-4b28-9438-c92c07b4dbca" containerID="4983761f2c57d0281c3ba9769cca57be3eb37e147e5c0cea0db36ec7089dd83c" exitCode=0 Dec 08 20:46:32 crc kubenswrapper[4781]: I1208 20:46:32.895889 
4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" event={"ID":"69ce8819-1a24-4b28-9438-c92c07b4dbca","Type":"ContainerDied","Data":"4983761f2c57d0281c3ba9769cca57be3eb37e147e5c0cea0db36ec7089dd83c"} Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.349570 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.380098 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-0\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.380175 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w9xc\" (UniqueName: \"kubernetes.io/projected/69ce8819-1a24-4b28-9438-c92c07b4dbca-kube-api-access-6w9xc\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.380263 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-extra-config-0\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.380310 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-0\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: 
I1208 20:46:34.380340 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-ssh-key\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.380368 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-inventory\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.380443 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-1\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.380510 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-combined-ca-bundle\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.380555 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-1\") pod \"69ce8819-1a24-4b28-9438-c92c07b4dbca\" (UID: \"69ce8819-1a24-4b28-9438-c92c07b4dbca\") " Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.431176 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ce8819-1a24-4b28-9438-c92c07b4dbca-kube-api-access-6w9xc" 
(OuterVolumeSpecName: "kube-api-access-6w9xc") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). InnerVolumeSpecName "kube-api-access-6w9xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.431996 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.435121 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.439770 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.458828 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). 
InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.460543 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-inventory" (OuterVolumeSpecName: "inventory") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.464503 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.467123 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.469325 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "69ce8819-1a24-4b28-9438-c92c07b4dbca" (UID: "69ce8819-1a24-4b28-9438-c92c07b4dbca"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.483388 4781 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.483427 4781 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.483439 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w9xc\" (UniqueName: \"kubernetes.io/projected/69ce8819-1a24-4b28-9438-c92c07b4dbca-kube-api-access-6w9xc\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.483451 4781 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.483463 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.483474 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.483486 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-inventory\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc 
kubenswrapper[4781]: I1208 20:46:34.483497 4781 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.483507 4781 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69ce8819-1a24-4b28-9438-c92c07b4dbca-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.921298 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" event={"ID":"69ce8819-1a24-4b28-9438-c92c07b4dbca","Type":"ContainerDied","Data":"0d4674cee71310bbd3c56d7cefc79fb3a912e71103a7f7d3850eecef3b26f049"} Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.921371 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d4674cee71310bbd3c56d7cefc79fb3a912e71103a7f7d3850eecef3b26f049" Dec 08 20:46:34 crc kubenswrapper[4781]: I1208 20:46:34.921385 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-5rsvg" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.065431 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf"] Dec 08 20:46:35 crc kubenswrapper[4781]: E1208 20:46:35.066047 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ce8819-1a24-4b28-9438-c92c07b4dbca" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.066071 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ce8819-1a24-4b28-9438-c92c07b4dbca" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 08 20:46:35 crc kubenswrapper[4781]: E1208 20:46:35.066115 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce658f2a-dd91-4746-8435-361f5c961cb5" containerName="collect-profiles" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.066125 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce658f2a-dd91-4746-8435-361f5c961cb5" containerName="collect-profiles" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.066360 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce658f2a-dd91-4746-8435-361f5c961cb5" containerName="collect-profiles" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.066406 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ce8819-1a24-4b28-9438-c92c07b4dbca" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.067221 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.075587 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.075648 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-2kfl7" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.075603 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.075853 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.075993 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.085471 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf"] Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.096183 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.096256 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.096278 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.096296 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwzlb\" (UniqueName: \"kubernetes.io/projected/9c03b281-e533-4108-9eff-0930b52141ca-kube-api-access-kwzlb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.096315 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.096338 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" 
Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.096412 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.198418 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.198503 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.198523 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.199361 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kwzlb\" (UniqueName: \"kubernetes.io/projected/9c03b281-e533-4108-9eff-0930b52141ca-kube-api-access-kwzlb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.199394 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.199692 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.199884 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.204136 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: 
\"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.204224 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.218433 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.218476 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.218439 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.219343 4781 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.224981 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwzlb\" (UniqueName: \"kubernetes.io/projected/9c03b281-e533-4108-9eff-0930b52141ca-kube-api-access-kwzlb\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.396330 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" Dec 08 20:46:35 crc kubenswrapper[4781]: I1208 20:46:35.970034 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf"] Dec 08 20:46:36 crc kubenswrapper[4781]: I1208 20:46:36.951216 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" event={"ID":"9c03b281-e533-4108-9eff-0930b52141ca","Type":"ContainerStarted","Data":"376b6e2f0f8e239f1b006a1a1ae5a890e0d137fb49f894e3ae88735b8e2eb898"} Dec 08 20:46:36 crc kubenswrapper[4781]: I1208 20:46:36.952084 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" event={"ID":"9c03b281-e533-4108-9eff-0930b52141ca","Type":"ContainerStarted","Data":"e8270877385c988c1dc0080c7b9f82195c77a1de1fef4f66b34d96cbb4a3771a"} Dec 08 20:46:36 crc kubenswrapper[4781]: I1208 20:46:36.989109 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" podStartSLOduration=1.474909991 podStartE2EDuration="1.989089803s" podCreationTimestamp="2025-12-08 20:46:35 +0000 UTC" firstStartedPulling="2025-12-08 20:46:35.984557143 +0000 UTC m=+2512.135840520" lastFinishedPulling="2025-12-08 20:46:36.498736905 +0000 UTC m=+2512.650020332" observedRunningTime="2025-12-08 20:46:36.98445757 +0000 UTC m=+2513.135740967" watchObservedRunningTime="2025-12-08 20:46:36.989089803 +0000 UTC m=+2513.140373190" Dec 08 20:46:43 crc kubenswrapper[4781]: I1208 20:46:43.126546 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:46:43 crc kubenswrapper[4781]: E1208 20:46:43.127591 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:46:57 crc kubenswrapper[4781]: I1208 20:46:57.126397 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:46:57 crc kubenswrapper[4781]: E1208 20:46:57.127233 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:47:09 crc kubenswrapper[4781]: I1208 20:47:09.126500 4781 scope.go:117] "RemoveContainer" 
containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:47:09 crc kubenswrapper[4781]: E1208 20:47:09.127282 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:47:23 crc kubenswrapper[4781]: I1208 20:47:23.126460 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:47:23 crc kubenswrapper[4781]: E1208 20:47:23.127780 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.281967 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kr447"] Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.284545 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.301816 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kr447"] Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.325180 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-catalog-content\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.325275 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-utilities\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.325626 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljchc\" (UniqueName: \"kubernetes.io/projected/e3d84d88-4c12-4072-9dd7-df22f28ce6af-kube-api-access-ljchc\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.427224 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-catalog-content\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.427299 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-utilities\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.427359 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljchc\" (UniqueName: \"kubernetes.io/projected/e3d84d88-4c12-4072-9dd7-df22f28ce6af-kube-api-access-ljchc\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.427767 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-catalog-content\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.427981 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-utilities\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.452734 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljchc\" (UniqueName: \"kubernetes.io/projected/e3d84d88-4c12-4072-9dd7-df22f28ce6af-kube-api-access-ljchc\") pod \"certified-operators-kr447\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") " pod="openshift-marketplace/certified-operators-kr447" Dec 08 20:47:31 crc kubenswrapper[4781]: I1208 20:47:31.603824 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kr447"
Dec 08 20:47:32 crc kubenswrapper[4781]: W1208 20:47:32.140853 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3d84d88_4c12_4072_9dd7_df22f28ce6af.slice/crio-48f78aa81e428c0d35cc2a9ffd31cb1943c962f540267d699bf99ecdfbc55230 WatchSource:0}: Error finding container 48f78aa81e428c0d35cc2a9ffd31cb1943c962f540267d699bf99ecdfbc55230: Status 404 returned error can't find the container with id 48f78aa81e428c0d35cc2a9ffd31cb1943c962f540267d699bf99ecdfbc55230
Dec 08 20:47:32 crc kubenswrapper[4781]: I1208 20:47:32.142478 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kr447"]
Dec 08 20:47:32 crc kubenswrapper[4781]: I1208 20:47:32.543108 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr447" event={"ID":"e3d84d88-4c12-4072-9dd7-df22f28ce6af","Type":"ContainerDied","Data":"c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd"}
Dec 08 20:47:32 crc kubenswrapper[4781]: I1208 20:47:32.544035 4781 generic.go:334] "Generic (PLEG): container finished" podID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerID="c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd" exitCode=0
Dec 08 20:47:32 crc kubenswrapper[4781]: I1208 20:47:32.544112 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr447" event={"ID":"e3d84d88-4c12-4072-9dd7-df22f28ce6af","Type":"ContainerStarted","Data":"48f78aa81e428c0d35cc2a9ffd31cb1943c962f540267d699bf99ecdfbc55230"}
Dec 08 20:47:33 crc kubenswrapper[4781]: I1208 20:47:33.554140 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr447" event={"ID":"e3d84d88-4c12-4072-9dd7-df22f28ce6af","Type":"ContainerDied","Data":"2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f"}
Dec 08 20:47:33 crc kubenswrapper[4781]: I1208 20:47:33.555097 4781 generic.go:334] "Generic (PLEG): container finished" podID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerID="2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f" exitCode=0
Dec 08 20:47:34 crc kubenswrapper[4781]: I1208 20:47:34.567392 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr447" event={"ID":"e3d84d88-4c12-4072-9dd7-df22f28ce6af","Type":"ContainerStarted","Data":"0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2"}
Dec 08 20:47:34 crc kubenswrapper[4781]: I1208 20:47:34.595134 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kr447" podStartSLOduration=1.953656422 podStartE2EDuration="3.595111514s" podCreationTimestamp="2025-12-08 20:47:31 +0000 UTC" firstStartedPulling="2025-12-08 20:47:32.544696918 +0000 UTC m=+2568.695980296" lastFinishedPulling="2025-12-08 20:47:34.186152021 +0000 UTC m=+2570.337435388" observedRunningTime="2025-12-08 20:47:34.584533551 +0000 UTC m=+2570.735816978" watchObservedRunningTime="2025-12-08 20:47:34.595111514 +0000 UTC m=+2570.746394891"
Dec 08 20:47:38 crc kubenswrapper[4781]: I1208 20:47:38.125717 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:47:38 crc kubenswrapper[4781]: E1208 20:47:38.126554 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:47:41 crc kubenswrapper[4781]: I1208 20:47:41.604823 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kr447"
Dec 08 20:47:41 crc kubenswrapper[4781]: I1208 20:47:41.605274 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kr447"
Dec 08 20:47:41 crc kubenswrapper[4781]: I1208 20:47:41.651158 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kr447"
Dec 08 20:47:41 crc kubenswrapper[4781]: I1208 20:47:41.755618 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kr447"
Dec 08 20:47:42 crc kubenswrapper[4781]: I1208 20:47:42.268629 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kr447"]
Dec 08 20:47:43 crc kubenswrapper[4781]: I1208 20:47:43.647034 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kr447" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerName="registry-server" containerID="cri-o://0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2" gracePeriod=2
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.235553 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kr447"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.267676 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-catalog-content\") pod \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") "
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.267728 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-utilities\") pod \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") "
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.267785 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljchc\" (UniqueName: \"kubernetes.io/projected/e3d84d88-4c12-4072-9dd7-df22f28ce6af-kube-api-access-ljchc\") pod \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\" (UID: \"e3d84d88-4c12-4072-9dd7-df22f28ce6af\") "
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.272590 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-utilities" (OuterVolumeSpecName: "utilities") pod "e3d84d88-4c12-4072-9dd7-df22f28ce6af" (UID: "e3d84d88-4c12-4072-9dd7-df22f28ce6af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.275331 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d84d88-4c12-4072-9dd7-df22f28ce6af-kube-api-access-ljchc" (OuterVolumeSpecName: "kube-api-access-ljchc") pod "e3d84d88-4c12-4072-9dd7-df22f28ce6af" (UID: "e3d84d88-4c12-4072-9dd7-df22f28ce6af"). InnerVolumeSpecName "kube-api-access-ljchc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.322467 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3d84d88-4c12-4072-9dd7-df22f28ce6af" (UID: "e3d84d88-4c12-4072-9dd7-df22f28ce6af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.370531 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.370847 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3d84d88-4c12-4072-9dd7-df22f28ce6af-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.370857 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljchc\" (UniqueName: \"kubernetes.io/projected/e3d84d88-4c12-4072-9dd7-df22f28ce6af-kube-api-access-ljchc\") on node \"crc\" DevicePath \"\""
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.656879 4781 generic.go:334] "Generic (PLEG): container finished" podID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerID="0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2" exitCode=0
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.656992 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr447" event={"ID":"e3d84d88-4c12-4072-9dd7-df22f28ce6af","Type":"ContainerDied","Data":"0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2"}
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.657045 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kr447"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.657090 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kr447" event={"ID":"e3d84d88-4c12-4072-9dd7-df22f28ce6af","Type":"ContainerDied","Data":"48f78aa81e428c0d35cc2a9ffd31cb1943c962f540267d699bf99ecdfbc55230"}
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.657138 4781 scope.go:117] "RemoveContainer" containerID="0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.684047 4781 scope.go:117] "RemoveContainer" containerID="2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.720655 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kr447"]
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.731308 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kr447"]
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.739709 4781 scope.go:117] "RemoveContainer" containerID="c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.773048 4781 scope.go:117] "RemoveContainer" containerID="0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2"
Dec 08 20:47:44 crc kubenswrapper[4781]: E1208 20:47:44.773706 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2\": container with ID starting with 0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2 not found: ID does not exist" containerID="0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.773745 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2"} err="failed to get container status \"0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2\": rpc error: code = NotFound desc = could not find container \"0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2\": container with ID starting with 0fac9241c5a2339b2c51098d68874aa9993e9b17208ac12ef952ea1d719bd2b2 not found: ID does not exist"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.773775 4781 scope.go:117] "RemoveContainer" containerID="2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f"
Dec 08 20:47:44 crc kubenswrapper[4781]: E1208 20:47:44.774381 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f\": container with ID starting with 2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f not found: ID does not exist" containerID="2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.774464 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f"} err="failed to get container status \"2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f\": rpc error: code = NotFound desc = could not find container \"2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f\": container with ID starting with 2d8977fd462bce3abaca0b37279aa5c6c4caf1d169b134e4d3369c55cdbfed2f not found: ID does not exist"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.774497 4781 scope.go:117] "RemoveContainer" containerID="c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd"
Dec 08 20:47:44 crc kubenswrapper[4781]: E1208 20:47:44.774985 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd\": container with ID starting with c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd not found: ID does not exist" containerID="c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd"
Dec 08 20:47:44 crc kubenswrapper[4781]: I1208 20:47:44.775047 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd"} err="failed to get container status \"c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd\": rpc error: code = NotFound desc = could not find container \"c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd\": container with ID starting with c8671291f19032788b2ff2e949803ea05e361d3d14b5acd1fea0ed431f2883bd not found: ID does not exist"
Dec 08 20:47:44 crc kubenswrapper[4781]: E1208 20:47:44.832356 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3d84d88_4c12_4072_9dd7_df22f28ce6af.slice/crio-48f78aa81e428c0d35cc2a9ffd31cb1943c962f540267d699bf99ecdfbc55230\": RecentStats: unable to find data in memory cache]"
Dec 08 20:47:46 crc kubenswrapper[4781]: I1208 20:47:46.136280 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" path="/var/lib/kubelet/pods/e3d84d88-4c12-4072-9dd7-df22f28ce6af/volumes"
Dec 08 20:47:53 crc kubenswrapper[4781]: I1208 20:47:53.126096 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:47:53 crc kubenswrapper[4781]: E1208 20:47:53.127264 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:48:04 crc kubenswrapper[4781]: I1208 20:48:04.132324 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:48:04 crc kubenswrapper[4781]: E1208 20:48:04.133971 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:48:18 crc kubenswrapper[4781]: I1208 20:48:18.126474 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:48:18 crc kubenswrapper[4781]: E1208 20:48:18.127484 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:48:31 crc kubenswrapper[4781]: I1208 20:48:31.127296 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:48:31 crc kubenswrapper[4781]: E1208 20:48:31.128268 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:48:46 crc kubenswrapper[4781]: I1208 20:48:46.126531 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:48:46 crc kubenswrapper[4781]: E1208 20:48:46.127536 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:48:57 crc kubenswrapper[4781]: I1208 20:48:57.125998 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:48:57 crc kubenswrapper[4781]: E1208 20:48:57.126704 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:48:58 crc kubenswrapper[4781]: I1208 20:48:58.349291 4781 generic.go:334] "Generic (PLEG): container finished" podID="9c03b281-e533-4108-9eff-0930b52141ca" containerID="376b6e2f0f8e239f1b006a1a1ae5a890e0d137fb49f894e3ae88735b8e2eb898" exitCode=0
Dec 08 20:48:58 crc kubenswrapper[4781]: I1208 20:48:58.349370 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" event={"ID":"9c03b281-e533-4108-9eff-0930b52141ca","Type":"ContainerDied","Data":"376b6e2f0f8e239f1b006a1a1ae5a890e0d137fb49f894e3ae88735b8e2eb898"}
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.782645 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf"
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.872941 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-0\") pod \"9c03b281-e533-4108-9eff-0930b52141ca\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") "
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.873068 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-telemetry-combined-ca-bundle\") pod \"9c03b281-e533-4108-9eff-0930b52141ca\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") "
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.873216 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ssh-key\") pod \"9c03b281-e533-4108-9eff-0930b52141ca\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") "
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.873300 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwzlb\" (UniqueName: \"kubernetes.io/projected/9c03b281-e533-4108-9eff-0930b52141ca-kube-api-access-kwzlb\") pod \"9c03b281-e533-4108-9eff-0930b52141ca\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") "
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.873370 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-2\") pod \"9c03b281-e533-4108-9eff-0930b52141ca\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") "
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.873453 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-1\") pod \"9c03b281-e533-4108-9eff-0930b52141ca\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") "
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.873490 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-inventory\") pod \"9c03b281-e533-4108-9eff-0930b52141ca\" (UID: \"9c03b281-e533-4108-9eff-0930b52141ca\") "
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.879810 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c03b281-e533-4108-9eff-0930b52141ca-kube-api-access-kwzlb" (OuterVolumeSpecName: "kube-api-access-kwzlb") pod "9c03b281-e533-4108-9eff-0930b52141ca" (UID: "9c03b281-e533-4108-9eff-0930b52141ca"). InnerVolumeSpecName "kube-api-access-kwzlb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.880152 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9c03b281-e533-4108-9eff-0930b52141ca" (UID: "9c03b281-e533-4108-9eff-0930b52141ca"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.900335 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "9c03b281-e533-4108-9eff-0930b52141ca" (UID: "9c03b281-e533-4108-9eff-0930b52141ca"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.903575 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c03b281-e533-4108-9eff-0930b52141ca" (UID: "9c03b281-e533-4108-9eff-0930b52141ca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.909913 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "9c03b281-e533-4108-9eff-0930b52141ca" (UID: "9c03b281-e533-4108-9eff-0930b52141ca"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.911137 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "9c03b281-e533-4108-9eff-0930b52141ca" (UID: "9c03b281-e533-4108-9eff-0930b52141ca"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.912939 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-inventory" (OuterVolumeSpecName: "inventory") pod "9c03b281-e533-4108-9eff-0930b52141ca" (UID: "9c03b281-e533-4108-9eff-0930b52141ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.976711 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.976750 4781 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.976766 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.976778 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwzlb\" (UniqueName: \"kubernetes.io/projected/9c03b281-e533-4108-9eff-0930b52141ca-kube-api-access-kwzlb\") on node \"crc\" DevicePath \"\""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.976791 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.976803 4781 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Dec 08 20:48:59 crc kubenswrapper[4781]: I1208 20:48:59.976817 4781 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c03b281-e533-4108-9eff-0930b52141ca-inventory\") on node \"crc\" DevicePath \"\""
Dec 08 20:49:00 crc kubenswrapper[4781]: I1208 20:49:00.368991 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf" event={"ID":"9c03b281-e533-4108-9eff-0930b52141ca","Type":"ContainerDied","Data":"e8270877385c988c1dc0080c7b9f82195c77a1de1fef4f66b34d96cbb4a3771a"}
Dec 08 20:49:00 crc kubenswrapper[4781]: I1208 20:49:00.369280 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8270877385c988c1dc0080c7b9f82195c77a1de1fef4f66b34d96cbb4a3771a"
Dec 08 20:49:00 crc kubenswrapper[4781]: I1208 20:49:00.369189 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf"
Dec 08 20:49:09 crc kubenswrapper[4781]: I1208 20:49:09.138723 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:49:09 crc kubenswrapper[4781]: E1208 20:49:09.149665 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:49:23 crc kubenswrapper[4781]: I1208 20:49:23.126843 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:49:23 crc kubenswrapper[4781]: E1208 20:49:23.129558 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:49:38 crc kubenswrapper[4781]: I1208 20:49:38.126296 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:49:38 crc kubenswrapper[4781]: E1208 20:49:38.127032 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:49:50 crc kubenswrapper[4781]: I1208 20:49:50.126345 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465"
Dec 08 20:49:50 crc kubenswrapper[4781]: E1208 20:49:50.126994 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.766629 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 08 20:49:56 crc kubenswrapper[4781]: E1208 20:49:56.767725 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerName="extract-content"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.767743 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerName="extract-content"
Dec 08 20:49:56 crc kubenswrapper[4781]: E1208 20:49:56.767776 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerName="registry-server"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.767784 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerName="registry-server"
Dec 08 20:49:56 crc kubenswrapper[4781]: E1208 20:49:56.767798 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c03b281-e533-4108-9eff-0930b52141ca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.767810 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c03b281-e533-4108-9eff-0930b52141ca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 08 20:49:56 crc kubenswrapper[4781]: E1208 20:49:56.767825 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerName="extract-utilities"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.767833 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerName="extract-utilities"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.768140 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c03b281-e533-4108-9eff-0930b52141ca" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.768171 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3d84d88-4c12-4072-9dd7-df22f28ce6af" containerName="registry-server"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.771558 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.776422 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-24hrz"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.777075 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.779687 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.788823 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.802632 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.878206 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.878707 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.878944 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-config-data\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.879119 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.879352 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.879553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.879774 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.880773 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983296 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983351 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsq6g\" (UniqueName: \"kubernetes.io/projected/e2b97db1-1e2a-45e9-b959-fde154131c3b-kube-api-access-lsq6g\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983390 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983423 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983453 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-config-data\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983478 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983547 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983577 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest"
Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.983594 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ssh-key\") pod
\"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.984265 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest" Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.984857 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.985437 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.985868 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.986629 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.990056 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.990568 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:56 crc kubenswrapper[4781]: I1208 20:49:56.991292 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:57 crc kubenswrapper[4781]: I1208 20:49:57.001883 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsq6g\" (UniqueName: \"kubernetes.io/projected/e2b97db1-1e2a-45e9-b959-fde154131c3b-kube-api-access-lsq6g\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:57 crc kubenswrapper[4781]: I1208 20:49:57.030491 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " pod="openstack/tempest-tests-tempest" Dec 08 20:49:57 crc kubenswrapper[4781]: I1208 20:49:57.102085 4781 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 08 20:49:57 crc kubenswrapper[4781]: I1208 20:49:57.585940 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 08 20:49:57 crc kubenswrapper[4781]: W1208 20:49:57.587216 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b97db1_1e2a_45e9_b959_fde154131c3b.slice/crio-f91dd50f1260b239b8df5c85be4416622ea1af2e361485df642b46941feddc8d WatchSource:0}: Error finding container f91dd50f1260b239b8df5c85be4416622ea1af2e361485df642b46941feddc8d: Status 404 returned error can't find the container with id f91dd50f1260b239b8df5c85be4416622ea1af2e361485df642b46941feddc8d Dec 08 20:49:57 crc kubenswrapper[4781]: I1208 20:49:57.589388 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 20:49:58 crc kubenswrapper[4781]: I1208 20:49:58.068482 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e2b97db1-1e2a-45e9-b959-fde154131c3b","Type":"ContainerStarted","Data":"f91dd50f1260b239b8df5c85be4416622ea1af2e361485df642b46941feddc8d"} Dec 08 20:50:02 crc kubenswrapper[4781]: I1208 20:50:02.126654 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:50:37 crc kubenswrapper[4781]: E1208 20:50:37.847017 4781 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 08 20:50:37 crc kubenswrapper[4781]: E1208 20:50:37.847738 4781 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lsq6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e2b97db1-1e2a-45e9-b959-fde154131c3b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 08 20:50:37 crc kubenswrapper[4781]: E1208 20:50:37.848874 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="e2b97db1-1e2a-45e9-b959-fde154131c3b" Dec 08 20:50:38 crc kubenswrapper[4781]: I1208 20:50:38.526236 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"9f6d09a64c62f4d83ffff4cc9da23e13114399aa52e32752b18f4a6c3d650559"} Dec 08 20:50:38 crc kubenswrapper[4781]: E1208 20:50:38.529523 4781 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="e2b97db1-1e2a-45e9-b959-fde154131c3b" Dec 08 20:50:51 crc kubenswrapper[4781]: I1208 20:50:51.612056 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 08 20:50:52 crc kubenswrapper[4781]: I1208 20:50:52.646447 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e2b97db1-1e2a-45e9-b959-fde154131c3b","Type":"ContainerStarted","Data":"35e14ec4dbbbc04bf381fc6644d4dab25b52641610e2c885f5d121397e5cbdd9"} Dec 08 20:50:52 crc kubenswrapper[4781]: I1208 20:50:52.669850 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.650235685 podStartE2EDuration="57.669832272s" podCreationTimestamp="2025-12-08 20:49:55 +0000 UTC" firstStartedPulling="2025-12-08 20:49:57.589199971 +0000 UTC m=+2713.740483348" lastFinishedPulling="2025-12-08 20:50:51.608796558 +0000 UTC m=+2767.760079935" observedRunningTime="2025-12-08 20:50:52.664387426 +0000 UTC m=+2768.815670803" watchObservedRunningTime="2025-12-08 20:50:52.669832272 +0000 UTC m=+2768.821115649" Dec 08 20:52:59 crc kubenswrapper[4781]: I1208 20:52:59.947746 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:52:59 crc kubenswrapper[4781]: I1208 20:52:59.948547 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" 
podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:53:29 crc kubenswrapper[4781]: I1208 20:53:29.948398 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:53:29 crc kubenswrapper[4781]: I1208 20:53:29.949337 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:53:59 crc kubenswrapper[4781]: I1208 20:53:59.947796 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:53:59 crc kubenswrapper[4781]: I1208 20:53:59.948422 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:53:59 crc kubenswrapper[4781]: I1208 20:53:59.948498 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:53:59 crc kubenswrapper[4781]: I1208 20:53:59.949697 4781 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f6d09a64c62f4d83ffff4cc9da23e13114399aa52e32752b18f4a6c3d650559"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:53:59 crc kubenswrapper[4781]: I1208 20:53:59.949784 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://9f6d09a64c62f4d83ffff4cc9da23e13114399aa52e32752b18f4a6c3d650559" gracePeriod=600 Dec 08 20:54:00 crc kubenswrapper[4781]: I1208 20:54:00.688418 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="9f6d09a64c62f4d83ffff4cc9da23e13114399aa52e32752b18f4a6c3d650559" exitCode=0 Dec 08 20:54:00 crc kubenswrapper[4781]: I1208 20:54:00.688473 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"9f6d09a64c62f4d83ffff4cc9da23e13114399aa52e32752b18f4a6c3d650559"} Dec 08 20:54:00 crc kubenswrapper[4781]: I1208 20:54:00.688998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469"} Dec 08 20:54:00 crc kubenswrapper[4781]: I1208 20:54:00.689054 4781 scope.go:117] "RemoveContainer" containerID="8667ff5b63e109fc57ed21d3b04fa7302aa20b6b1482a5bcce5c40015dbe9465" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.549796 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-xbwds"] Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.558359 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.562604 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbwds"] Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.649024 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqjbx\" (UniqueName: \"kubernetes.io/projected/4f687464-d0cd-47ad-b29b-b15c35d699a5-kube-api-access-jqjbx\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.649204 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-utilities\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.649249 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-catalog-content\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.750849 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-utilities\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " 
pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.750914 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-catalog-content\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.751009 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqjbx\" (UniqueName: \"kubernetes.io/projected/4f687464-d0cd-47ad-b29b-b15c35d699a5-kube-api-access-jqjbx\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.751872 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-utilities\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.752181 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-catalog-content\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.771304 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqjbx\" (UniqueName: \"kubernetes.io/projected/4f687464-d0cd-47ad-b29b-b15c35d699a5-kube-api-access-jqjbx\") pod \"redhat-operators-xbwds\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " pod="openshift-marketplace/redhat-operators-xbwds" Dec 
08 20:54:01 crc kubenswrapper[4781]: I1208 20:54:01.885245 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:02 crc kubenswrapper[4781]: I1208 20:54:02.371823 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbwds"] Dec 08 20:54:02 crc kubenswrapper[4781]: I1208 20:54:02.711627 4781 generic.go:334] "Generic (PLEG): container finished" podID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerID="9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb" exitCode=0 Dec 08 20:54:02 crc kubenswrapper[4781]: I1208 20:54:02.711677 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbwds" event={"ID":"4f687464-d0cd-47ad-b29b-b15c35d699a5","Type":"ContainerDied","Data":"9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb"} Dec 08 20:54:02 crc kubenswrapper[4781]: I1208 20:54:02.711727 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbwds" event={"ID":"4f687464-d0cd-47ad-b29b-b15c35d699a5","Type":"ContainerStarted","Data":"93bfcad201ed7bfa86de4bac6460a83d12cddb00620fb4b7971059a7b1e02745"} Dec 08 20:54:03 crc kubenswrapper[4781]: I1208 20:54:03.722278 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbwds" event={"ID":"4f687464-d0cd-47ad-b29b-b15c35d699a5","Type":"ContainerStarted","Data":"c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f"} Dec 08 20:54:05 crc kubenswrapper[4781]: I1208 20:54:05.740663 4781 generic.go:334] "Generic (PLEG): container finished" podID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerID="c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f" exitCode=0 Dec 08 20:54:05 crc kubenswrapper[4781]: I1208 20:54:05.740770 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbwds" 
event={"ID":"4f687464-d0cd-47ad-b29b-b15c35d699a5","Type":"ContainerDied","Data":"c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f"} Dec 08 20:54:06 crc kubenswrapper[4781]: I1208 20:54:06.756046 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbwds" event={"ID":"4f687464-d0cd-47ad-b29b-b15c35d699a5","Type":"ContainerStarted","Data":"227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371"} Dec 08 20:54:06 crc kubenswrapper[4781]: I1208 20:54:06.776239 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xbwds" podStartSLOduration=2.133216324 podStartE2EDuration="5.776222322s" podCreationTimestamp="2025-12-08 20:54:01 +0000 UTC" firstStartedPulling="2025-12-08 20:54:02.713323536 +0000 UTC m=+2958.864606913" lastFinishedPulling="2025-12-08 20:54:06.356329534 +0000 UTC m=+2962.507612911" observedRunningTime="2025-12-08 20:54:06.77336062 +0000 UTC m=+2962.924643997" watchObservedRunningTime="2025-12-08 20:54:06.776222322 +0000 UTC m=+2962.927505699" Dec 08 20:54:11 crc kubenswrapper[4781]: I1208 20:54:11.885480 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:11 crc kubenswrapper[4781]: I1208 20:54:11.887219 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:12 crc kubenswrapper[4781]: I1208 20:54:12.946031 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xbwds" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="registry-server" probeResult="failure" output=< Dec 08 20:54:12 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 08 20:54:12 crc kubenswrapper[4781]: > Dec 08 20:54:21 crc kubenswrapper[4781]: I1208 20:54:21.950558 4781 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:22 crc kubenswrapper[4781]: I1208 20:54:22.015218 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:22 crc kubenswrapper[4781]: I1208 20:54:22.199090 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbwds"] Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.055665 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xbwds" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="registry-server" containerID="cri-o://227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371" gracePeriod=2 Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.578985 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.702100 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-utilities\") pod \"4f687464-d0cd-47ad-b29b-b15c35d699a5\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.702252 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqjbx\" (UniqueName: \"kubernetes.io/projected/4f687464-d0cd-47ad-b29b-b15c35d699a5-kube-api-access-jqjbx\") pod \"4f687464-d0cd-47ad-b29b-b15c35d699a5\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.702285 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-catalog-content\") pod 
\"4f687464-d0cd-47ad-b29b-b15c35d699a5\" (UID: \"4f687464-d0cd-47ad-b29b-b15c35d699a5\") " Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.703074 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-utilities" (OuterVolumeSpecName: "utilities") pod "4f687464-d0cd-47ad-b29b-b15c35d699a5" (UID: "4f687464-d0cd-47ad-b29b-b15c35d699a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.709164 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f687464-d0cd-47ad-b29b-b15c35d699a5-kube-api-access-jqjbx" (OuterVolumeSpecName: "kube-api-access-jqjbx") pod "4f687464-d0cd-47ad-b29b-b15c35d699a5" (UID: "4f687464-d0cd-47ad-b29b-b15c35d699a5"). InnerVolumeSpecName "kube-api-access-jqjbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.804148 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.804179 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqjbx\" (UniqueName: \"kubernetes.io/projected/4f687464-d0cd-47ad-b29b-b15c35d699a5-kube-api-access-jqjbx\") on node \"crc\" DevicePath \"\"" Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.820612 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f687464-d0cd-47ad-b29b-b15c35d699a5" (UID: "4f687464-d0cd-47ad-b29b-b15c35d699a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:54:23 crc kubenswrapper[4781]: I1208 20:54:23.906018 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f687464-d0cd-47ad-b29b-b15c35d699a5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.068014 4781 generic.go:334] "Generic (PLEG): container finished" podID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerID="227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371" exitCode=0 Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.068067 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbwds" event={"ID":"4f687464-d0cd-47ad-b29b-b15c35d699a5","Type":"ContainerDied","Data":"227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371"} Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.068106 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbwds" event={"ID":"4f687464-d0cd-47ad-b29b-b15c35d699a5","Type":"ContainerDied","Data":"93bfcad201ed7bfa86de4bac6460a83d12cddb00620fb4b7971059a7b1e02745"} Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.068146 4781 scope.go:117] "RemoveContainer" containerID="227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.068191 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xbwds" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.094968 4781 scope.go:117] "RemoveContainer" containerID="c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.112414 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbwds"] Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.122862 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xbwds"] Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.139047 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" path="/var/lib/kubelet/pods/4f687464-d0cd-47ad-b29b-b15c35d699a5/volumes" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.143975 4781 scope.go:117] "RemoveContainer" containerID="9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.168184 4781 scope.go:117] "RemoveContainer" containerID="227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371" Dec 08 20:54:24 crc kubenswrapper[4781]: E1208 20:54:24.169896 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371\": container with ID starting with 227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371 not found: ID does not exist" containerID="227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.170014 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371"} err="failed to get container status 
\"227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371\": rpc error: code = NotFound desc = could not find container \"227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371\": container with ID starting with 227a4f8fb4bace7b7c6c557305a069c67baaac91592d6b25566c61ae852f0371 not found: ID does not exist" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.170047 4781 scope.go:117] "RemoveContainer" containerID="c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f" Dec 08 20:54:24 crc kubenswrapper[4781]: E1208 20:54:24.174253 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f\": container with ID starting with c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f not found: ID does not exist" containerID="c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.174292 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f"} err="failed to get container status \"c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f\": rpc error: code = NotFound desc = could not find container \"c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f\": container with ID starting with c28041b959b820305d305e494b676db5e422cdfba3c45c0dad33eb5c41a00a4f not found: ID does not exist" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.174315 4781 scope.go:117] "RemoveContainer" containerID="9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb" Dec 08 20:54:24 crc kubenswrapper[4781]: E1208 20:54:24.176335 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb\": container with ID starting with 9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb not found: ID does not exist" containerID="9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb" Dec 08 20:54:24 crc kubenswrapper[4781]: I1208 20:54:24.176361 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb"} err="failed to get container status \"9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb\": rpc error: code = NotFound desc = could not find container \"9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb\": container with ID starting with 9f238ebd8ff81bd8fbf9d4d2c1ddacff500294b29e269460edd9557ccdc8b8bb not found: ID does not exist" Dec 08 20:56:29 crc kubenswrapper[4781]: I1208 20:56:29.948768 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:56:29 crc kubenswrapper[4781]: I1208 20:56:29.949398 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:56:59 crc kubenswrapper[4781]: I1208 20:56:59.948800 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:56:59 crc 
kubenswrapper[4781]: I1208 20:56:59.949568 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:57:29 crc kubenswrapper[4781]: I1208 20:57:29.947621 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 20:57:29 crc kubenswrapper[4781]: I1208 20:57:29.948194 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 20:57:29 crc kubenswrapper[4781]: I1208 20:57:29.948238 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 20:57:29 crc kubenswrapper[4781]: I1208 20:57:29.948934 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 20:57:29 crc kubenswrapper[4781]: I1208 20:57:29.948983 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" 
podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" gracePeriod=600 Dec 08 20:57:30 crc kubenswrapper[4781]: E1208 20:57:30.073129 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:57:30 crc kubenswrapper[4781]: I1208 20:57:30.921628 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" exitCode=0 Dec 08 20:57:30 crc kubenswrapper[4781]: I1208 20:57:30.921739 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469"} Dec 08 20:57:30 crc kubenswrapper[4781]: I1208 20:57:30.922037 4781 scope.go:117] "RemoveContainer" containerID="9f6d09a64c62f4d83ffff4cc9da23e13114399aa52e32752b18f4a6c3d650559" Dec 08 20:57:30 crc kubenswrapper[4781]: I1208 20:57:30.922805 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:57:30 crc kubenswrapper[4781]: E1208 20:57:30.923262 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:57:43 crc kubenswrapper[4781]: I1208 20:57:43.126052 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:57:43 crc kubenswrapper[4781]: E1208 20:57:43.126989 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:57:56 crc kubenswrapper[4781]: I1208 20:57:56.127295 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:57:56 crc kubenswrapper[4781]: E1208 20:57:56.128452 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:58:08 crc kubenswrapper[4781]: I1208 20:58:08.126809 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:58:08 crc kubenswrapper[4781]: E1208 20:58:08.127891 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.794311 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n4f4n"] Dec 08 20:58:17 crc kubenswrapper[4781]: E1208 20:58:17.798966 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="extract-utilities" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.798987 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="extract-utilities" Dec 08 20:58:17 crc kubenswrapper[4781]: E1208 20:58:17.799023 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="registry-server" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.799061 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="registry-server" Dec 08 20:58:17 crc kubenswrapper[4781]: E1208 20:58:17.799084 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="extract-content" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.799092 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="extract-content" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.799418 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f687464-d0cd-47ad-b29b-b15c35d699a5" containerName="registry-server" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.802931 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.809430 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n4f4n"] Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.926414 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-catalog-content\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.926632 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-utilities\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:17 crc kubenswrapper[4781]: I1208 20:58:17.926784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjjj6\" (UniqueName: \"kubernetes.io/projected/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-kube-api-access-rjjj6\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:18 crc kubenswrapper[4781]: I1208 20:58:18.029105 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-utilities\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:18 crc kubenswrapper[4781]: I1208 20:58:18.029177 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rjjj6\" (UniqueName: \"kubernetes.io/projected/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-kube-api-access-rjjj6\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:18 crc kubenswrapper[4781]: I1208 20:58:18.029268 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-catalog-content\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:18 crc kubenswrapper[4781]: I1208 20:58:18.029758 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-catalog-content\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:18 crc kubenswrapper[4781]: I1208 20:58:18.029979 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-utilities\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:18 crc kubenswrapper[4781]: I1208 20:58:18.059197 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjjj6\" (UniqueName: \"kubernetes.io/projected/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-kube-api-access-rjjj6\") pod \"certified-operators-n4f4n\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:18 crc kubenswrapper[4781]: I1208 20:58:18.135930 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:18 crc kubenswrapper[4781]: I1208 20:58:18.604416 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n4f4n"] Dec 08 20:58:19 crc kubenswrapper[4781]: I1208 20:58:19.535172 4781 generic.go:334] "Generic (PLEG): container finished" podID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerID="33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766" exitCode=0 Dec 08 20:58:19 crc kubenswrapper[4781]: I1208 20:58:19.535314 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n4f4n" event={"ID":"ea193594-36cc-4304-a70b-bd9bb5d1c4ad","Type":"ContainerDied","Data":"33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766"} Dec 08 20:58:19 crc kubenswrapper[4781]: I1208 20:58:19.535545 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n4f4n" event={"ID":"ea193594-36cc-4304-a70b-bd9bb5d1c4ad","Type":"ContainerStarted","Data":"696233be2f873ae7bf316ba38e3b0c72076f1caa48ad61aa03ca8f802c34dbc8"} Dec 08 20:58:19 crc kubenswrapper[4781]: I1208 20:58:19.536977 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 20:58:20 crc kubenswrapper[4781]: I1208 20:58:20.548721 4781 generic.go:334] "Generic (PLEG): container finished" podID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerID="e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71" exitCode=0 Dec 08 20:58:20 crc kubenswrapper[4781]: I1208 20:58:20.548802 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n4f4n" event={"ID":"ea193594-36cc-4304-a70b-bd9bb5d1c4ad","Type":"ContainerDied","Data":"e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71"} Dec 08 20:58:21 crc kubenswrapper[4781]: I1208 20:58:21.126301 4781 scope.go:117] "RemoveContainer" 
containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:58:21 crc kubenswrapper[4781]: E1208 20:58:21.126500 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:58:21 crc kubenswrapper[4781]: I1208 20:58:21.561809 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n4f4n" event={"ID":"ea193594-36cc-4304-a70b-bd9bb5d1c4ad","Type":"ContainerStarted","Data":"21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445"} Dec 08 20:58:21 crc kubenswrapper[4781]: I1208 20:58:21.583138 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n4f4n" podStartSLOduration=3.15838291 podStartE2EDuration="4.583119506s" podCreationTimestamp="2025-12-08 20:58:17 +0000 UTC" firstStartedPulling="2025-12-08 20:58:19.536678533 +0000 UTC m=+3215.687961910" lastFinishedPulling="2025-12-08 20:58:20.961415129 +0000 UTC m=+3217.112698506" observedRunningTime="2025-12-08 20:58:21.57941192 +0000 UTC m=+3217.730695287" watchObservedRunningTime="2025-12-08 20:58:21.583119506 +0000 UTC m=+3217.734402883" Dec 08 20:58:28 crc kubenswrapper[4781]: I1208 20:58:28.136899 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:28 crc kubenswrapper[4781]: I1208 20:58:28.138390 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:28 crc kubenswrapper[4781]: I1208 20:58:28.194001 4781 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:28 crc kubenswrapper[4781]: I1208 20:58:28.671599 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:28 crc kubenswrapper[4781]: I1208 20:58:28.724256 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n4f4n"] Dec 08 20:58:30 crc kubenswrapper[4781]: I1208 20:58:30.640446 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n4f4n" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerName="registry-server" containerID="cri-o://21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445" gracePeriod=2 Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.176403 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.215646 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjjj6\" (UniqueName: \"kubernetes.io/projected/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-kube-api-access-rjjj6\") pod \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.215800 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-utilities\") pod \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.215909 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-catalog-content\") pod \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\" (UID: \"ea193594-36cc-4304-a70b-bd9bb5d1c4ad\") " Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.219367 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-utilities" (OuterVolumeSpecName: "utilities") pod "ea193594-36cc-4304-a70b-bd9bb5d1c4ad" (UID: "ea193594-36cc-4304-a70b-bd9bb5d1c4ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.223194 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-kube-api-access-rjjj6" (OuterVolumeSpecName: "kube-api-access-rjjj6") pod "ea193594-36cc-4304-a70b-bd9bb5d1c4ad" (UID: "ea193594-36cc-4304-a70b-bd9bb5d1c4ad"). InnerVolumeSpecName "kube-api-access-rjjj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.318329 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.318377 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjjj6\" (UniqueName: \"kubernetes.io/projected/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-kube-api-access-rjjj6\") on node \"crc\" DevicePath \"\"" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.399839 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea193594-36cc-4304-a70b-bd9bb5d1c4ad" (UID: "ea193594-36cc-4304-a70b-bd9bb5d1c4ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.419396 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea193594-36cc-4304-a70b-bd9bb5d1c4ad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.650997 4781 generic.go:334] "Generic (PLEG): container finished" podID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerID="21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445" exitCode=0 Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.651103 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n4f4n" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.651159 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n4f4n" event={"ID":"ea193594-36cc-4304-a70b-bd9bb5d1c4ad","Type":"ContainerDied","Data":"21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445"} Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.651501 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n4f4n" event={"ID":"ea193594-36cc-4304-a70b-bd9bb5d1c4ad","Type":"ContainerDied","Data":"696233be2f873ae7bf316ba38e3b0c72076f1caa48ad61aa03ca8f802c34dbc8"} Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.651543 4781 scope.go:117] "RemoveContainer" containerID="21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.673641 4781 scope.go:117] "RemoveContainer" containerID="e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.688169 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n4f4n"] Dec 08 20:58:31 crc kubenswrapper[4781]: 
I1208 20:58:31.698021 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n4f4n"] Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.711330 4781 scope.go:117] "RemoveContainer" containerID="33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.741271 4781 scope.go:117] "RemoveContainer" containerID="21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445" Dec 08 20:58:31 crc kubenswrapper[4781]: E1208 20:58:31.741809 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445\": container with ID starting with 21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445 not found: ID does not exist" containerID="21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.741849 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445"} err="failed to get container status \"21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445\": rpc error: code = NotFound desc = could not find container \"21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445\": container with ID starting with 21107e4558fb28338540516fd485c2c05b9852fc2637175c836177544d00f445 not found: ID does not exist" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.741869 4781 scope.go:117] "RemoveContainer" containerID="e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71" Dec 08 20:58:31 crc kubenswrapper[4781]: E1208 20:58:31.742351 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71\": container 
with ID starting with e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71 not found: ID does not exist" containerID="e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.742377 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71"} err="failed to get container status \"e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71\": rpc error: code = NotFound desc = could not find container \"e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71\": container with ID starting with e180df879546ed510768a8143c264e48c08fe04e0b422b0bc116a5f3ebd4ff71 not found: ID does not exist" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.742392 4781 scope.go:117] "RemoveContainer" containerID="33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766" Dec 08 20:58:31 crc kubenswrapper[4781]: E1208 20:58:31.742619 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766\": container with ID starting with 33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766 not found: ID does not exist" containerID="33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766" Dec 08 20:58:31 crc kubenswrapper[4781]: I1208 20:58:31.742667 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766"} err="failed to get container status \"33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766\": rpc error: code = NotFound desc = could not find container \"33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766\": container with ID starting with 33899bb85640fc3cde5513b74be474c7d4476ecd9075abf9ba7cea1c5716a766 not 
found: ID does not exist" Dec 08 20:58:32 crc kubenswrapper[4781]: I1208 20:58:32.148599 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" path="/var/lib/kubelet/pods/ea193594-36cc-4304-a70b-bd9bb5d1c4ad/volumes" Dec 08 20:58:34 crc kubenswrapper[4781]: I1208 20:58:34.131903 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:58:34 crc kubenswrapper[4781]: E1208 20:58:34.132507 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:58:45 crc kubenswrapper[4781]: I1208 20:58:45.125864 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:58:45 crc kubenswrapper[4781]: E1208 20:58:45.126790 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:58:56 crc kubenswrapper[4781]: I1208 20:58:56.126229 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:58:56 crc kubenswrapper[4781]: E1208 20:58:56.127385 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:59:08 crc kubenswrapper[4781]: I1208 20:59:08.126727 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:59:08 crc kubenswrapper[4781]: E1208 20:59:08.127294 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:59:23 crc kubenswrapper[4781]: I1208 20:59:23.126257 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:59:23 crc kubenswrapper[4781]: E1208 20:59:23.127375 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:59:34 crc kubenswrapper[4781]: I1208 20:59:34.135388 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:59:34 crc kubenswrapper[4781]: E1208 20:59:34.136285 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:59:48 crc kubenswrapper[4781]: I1208 20:59:48.125992 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 20:59:48 crc kubenswrapper[4781]: E1208 20:59:48.126865 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 20:59:53 crc kubenswrapper[4781]: I1208 20:59:53.936117 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qdmzv"] Dec 08 20:59:53 crc kubenswrapper[4781]: E1208 20:59:53.937710 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerName="extract-content" Dec 08 20:59:53 crc kubenswrapper[4781]: I1208 20:59:53.937733 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerName="extract-content" Dec 08 20:59:53 crc kubenswrapper[4781]: E1208 20:59:53.937770 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerName="extract-utilities" Dec 08 20:59:53 crc kubenswrapper[4781]: I1208 20:59:53.937778 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerName="extract-utilities" Dec 08 20:59:53 crc 
kubenswrapper[4781]: E1208 20:59:53.937810 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerName="registry-server" Dec 08 20:59:53 crc kubenswrapper[4781]: I1208 20:59:53.937817 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerName="registry-server" Dec 08 20:59:53 crc kubenswrapper[4781]: I1208 20:59:53.938335 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea193594-36cc-4304-a70b-bd9bb5d1c4ad" containerName="registry-server" Dec 08 20:59:53 crc kubenswrapper[4781]: I1208 20:59:53.941485 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:53 crc kubenswrapper[4781]: I1208 20:59:53.957725 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdmzv"] Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.108367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-catalog-content\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.108885 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-utilities\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.109020 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bzm9\" (UniqueName: 
\"kubernetes.io/projected/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-kube-api-access-7bzm9\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.211054 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-utilities\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.211453 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bzm9\" (UniqueName: \"kubernetes.io/projected/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-kube-api-access-7bzm9\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.211554 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-utilities\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.211707 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-catalog-content\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.212044 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-catalog-content\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.232123 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bzm9\" (UniqueName: \"kubernetes.io/projected/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-kube-api-access-7bzm9\") pod \"community-operators-qdmzv\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.287252 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdmzv" Dec 08 20:59:54 crc kubenswrapper[4781]: I1208 20:59:54.803347 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdmzv"] Dec 08 20:59:55 crc kubenswrapper[4781]: I1208 20:59:55.424708 4781 generic.go:334] "Generic (PLEG): container finished" podID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerID="3f340a8d155231c8ecf4631ab1c2700c96131efa627199e11eaceff95ec50b55" exitCode=0 Dec 08 20:59:55 crc kubenswrapper[4781]: I1208 20:59:55.424816 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdmzv" event={"ID":"8b08acad-6664-4fd4-b958-e1fb5d34b7e9","Type":"ContainerDied","Data":"3f340a8d155231c8ecf4631ab1c2700c96131efa627199e11eaceff95ec50b55"} Dec 08 20:59:55 crc kubenswrapper[4781]: I1208 20:59:55.425047 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdmzv" event={"ID":"8b08acad-6664-4fd4-b958-e1fb5d34b7e9","Type":"ContainerStarted","Data":"253279ac1fb5010c2ea8edf01f7ad21eeb59c27dcc567e36ddc0cb1c3f1d12f7"} Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.322713 4781 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-dxm55"] Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.325495 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.340903 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxm55"] Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.349618 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-catalog-content\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.349784 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2gkz\" (UniqueName: \"kubernetes.io/projected/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-kube-api-access-c2gkz\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.349872 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-utilities\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.434649 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdmzv" event={"ID":"8b08acad-6664-4fd4-b958-e1fb5d34b7e9","Type":"ContainerStarted","Data":"2b485db77a4a47964409f952e958afa9d6d581e0e73003c28005ff7d446e0bda"} 
Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.452294 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-utilities\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.453971 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-catalog-content\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.454765 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2gkz\" (UniqueName: \"kubernetes.io/projected/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-kube-api-access-c2gkz\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.454379 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-catalog-content\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.453099 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-utilities\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 
20:59:56.478172 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2gkz\" (UniqueName: \"kubernetes.io/projected/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-kube-api-access-c2gkz\") pod \"redhat-marketplace-dxm55\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:56 crc kubenswrapper[4781]: I1208 20:59:56.650573 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 20:59:57 crc kubenswrapper[4781]: I1208 20:59:57.148571 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxm55"] Dec 08 20:59:57 crc kubenswrapper[4781]: I1208 20:59:57.447946 4781 generic.go:334] "Generic (PLEG): container finished" podID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerID="e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521" exitCode=0 Dec 08 20:59:57 crc kubenswrapper[4781]: I1208 20:59:57.448025 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxm55" event={"ID":"5d1e241c-c400-49f2-ad73-1d0c0a3652cc","Type":"ContainerDied","Data":"e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521"} Dec 08 20:59:57 crc kubenswrapper[4781]: I1208 20:59:57.448058 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxm55" event={"ID":"5d1e241c-c400-49f2-ad73-1d0c0a3652cc","Type":"ContainerStarted","Data":"8cb30f3fb9d865b3f5b84b4395625f30740b596de8274d8998612eabc6528d79"} Dec 08 20:59:57 crc kubenswrapper[4781]: I1208 20:59:57.452165 4781 generic.go:334] "Generic (PLEG): container finished" podID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerID="2b485db77a4a47964409f952e958afa9d6d581e0e73003c28005ff7d446e0bda" exitCode=0 Dec 08 20:59:57 crc kubenswrapper[4781]: I1208 20:59:57.452213 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-qdmzv" event={"ID":"8b08acad-6664-4fd4-b958-e1fb5d34b7e9","Type":"ContainerDied","Data":"2b485db77a4a47964409f952e958afa9d6d581e0e73003c28005ff7d446e0bda"} Dec 08 20:59:58 crc kubenswrapper[4781]: I1208 20:59:58.463492 4781 generic.go:334] "Generic (PLEG): container finished" podID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerID="6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797" exitCode=0 Dec 08 20:59:58 crc kubenswrapper[4781]: I1208 20:59:58.463565 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxm55" event={"ID":"5d1e241c-c400-49f2-ad73-1d0c0a3652cc","Type":"ContainerDied","Data":"6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797"} Dec 08 20:59:58 crc kubenswrapper[4781]: I1208 20:59:58.468273 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdmzv" event={"ID":"8b08acad-6664-4fd4-b958-e1fb5d34b7e9","Type":"ContainerStarted","Data":"967655d68d64c144e1f0d62c3d97ffb55614ec5f1960f4e43a6ead31c48ef55e"} Dec 08 20:59:58 crc kubenswrapper[4781]: I1208 20:59:58.504525 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qdmzv" podStartSLOduration=3.067611273 podStartE2EDuration="5.50450614s" podCreationTimestamp="2025-12-08 20:59:53 +0000 UTC" firstStartedPulling="2025-12-08 20:59:55.426867622 +0000 UTC m=+3311.578151039" lastFinishedPulling="2025-12-08 20:59:57.863762529 +0000 UTC m=+3314.015045906" observedRunningTime="2025-12-08 20:59:58.501285108 +0000 UTC m=+3314.652568485" watchObservedRunningTime="2025-12-08 20:59:58.50450614 +0000 UTC m=+3314.655789517" Dec 08 20:59:59 crc kubenswrapper[4781]: I1208 20:59:59.479100 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxm55" 
event={"ID":"5d1e241c-c400-49f2-ad73-1d0c0a3652cc","Type":"ContainerStarted","Data":"597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89"} Dec 08 20:59:59 crc kubenswrapper[4781]: I1208 20:59:59.499687 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dxm55" podStartSLOduration=2.046657357 podStartE2EDuration="3.499662993s" podCreationTimestamp="2025-12-08 20:59:56 +0000 UTC" firstStartedPulling="2025-12-08 20:59:57.449619117 +0000 UTC m=+3313.600902494" lastFinishedPulling="2025-12-08 20:59:58.902624763 +0000 UTC m=+3315.053908130" observedRunningTime="2025-12-08 20:59:59.495851764 +0000 UTC m=+3315.647135161" watchObservedRunningTime="2025-12-08 20:59:59.499662993 +0000 UTC m=+3315.650946380" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.151746 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2"] Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.153033 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.155134 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.155577 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.166013 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2"] Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.233730 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b384f885-0de9-499a-87c4-574c73a01f6a-config-volume\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.233813 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9ctr\" (UniqueName: \"kubernetes.io/projected/b384f885-0de9-499a-87c4-574c73a01f6a-kube-api-access-f9ctr\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.233869 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b384f885-0de9-499a-87c4-574c73a01f6a-secret-volume\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.335902 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b384f885-0de9-499a-87c4-574c73a01f6a-config-volume\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.336034 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9ctr\" (UniqueName: \"kubernetes.io/projected/b384f885-0de9-499a-87c4-574c73a01f6a-kube-api-access-f9ctr\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.336098 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b384f885-0de9-499a-87c4-574c73a01f6a-secret-volume\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.336823 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b384f885-0de9-499a-87c4-574c73a01f6a-config-volume\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.343292 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b384f885-0de9-499a-87c4-574c73a01f6a-secret-volume\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.356076 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9ctr\" (UniqueName: \"kubernetes.io/projected/b384f885-0de9-499a-87c4-574c73a01f6a-kube-api-access-f9ctr\") pod \"collect-profiles-29420460-6dmj2\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.478751 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:00 crc kubenswrapper[4781]: I1208 21:00:00.931103 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2"] Dec 08 21:00:01 crc kubenswrapper[4781]: I1208 21:00:01.501873 4781 generic.go:334] "Generic (PLEG): container finished" podID="b384f885-0de9-499a-87c4-574c73a01f6a" containerID="c5d9e40811a76724b825ae7f7dbe5a9e9d66426f0233e351ff2edc3da0d0a192" exitCode=0 Dec 08 21:00:01 crc kubenswrapper[4781]: I1208 21:00:01.501999 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" event={"ID":"b384f885-0de9-499a-87c4-574c73a01f6a","Type":"ContainerDied","Data":"c5d9e40811a76724b825ae7f7dbe5a9e9d66426f0233e351ff2edc3da0d0a192"} Dec 08 21:00:01 crc kubenswrapper[4781]: I1208 21:00:01.502253 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" 
event={"ID":"b384f885-0de9-499a-87c4-574c73a01f6a","Type":"ContainerStarted","Data":"40cedbd7575a1bfe6be57a8b83b7c45b1fbfdf5fe1698517aa9259076ff6d6eb"} Dec 08 21:00:02 crc kubenswrapper[4781]: I1208 21:00:02.906743 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.147305 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b384f885-0de9-499a-87c4-574c73a01f6a-secret-volume\") pod \"b384f885-0de9-499a-87c4-574c73a01f6a\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.147435 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b384f885-0de9-499a-87c4-574c73a01f6a-config-volume\") pod \"b384f885-0de9-499a-87c4-574c73a01f6a\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.148163 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9ctr\" (UniqueName: \"kubernetes.io/projected/b384f885-0de9-499a-87c4-574c73a01f6a-kube-api-access-f9ctr\") pod \"b384f885-0de9-499a-87c4-574c73a01f6a\" (UID: \"b384f885-0de9-499a-87c4-574c73a01f6a\") " Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.148304 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b384f885-0de9-499a-87c4-574c73a01f6a-config-volume" (OuterVolumeSpecName: "config-volume") pod "b384f885-0de9-499a-87c4-574c73a01f6a" (UID: "b384f885-0de9-499a-87c4-574c73a01f6a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.148361 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:00:03 crc kubenswrapper[4781]: E1208 21:00:03.148671 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.149789 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b384f885-0de9-499a-87c4-574c73a01f6a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.153019 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b384f885-0de9-499a-87c4-574c73a01f6a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b384f885-0de9-499a-87c4-574c73a01f6a" (UID: "b384f885-0de9-499a-87c4-574c73a01f6a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.159612 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b384f885-0de9-499a-87c4-574c73a01f6a-kube-api-access-f9ctr" (OuterVolumeSpecName: "kube-api-access-f9ctr") pod "b384f885-0de9-499a-87c4-574c73a01f6a" (UID: "b384f885-0de9-499a-87c4-574c73a01f6a"). InnerVolumeSpecName "kube-api-access-f9ctr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.252330 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b384f885-0de9-499a-87c4-574c73a01f6a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.252362 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9ctr\" (UniqueName: \"kubernetes.io/projected/b384f885-0de9-499a-87c4-574c73a01f6a-kube-api-access-f9ctr\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.539472 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" event={"ID":"b384f885-0de9-499a-87c4-574c73a01f6a","Type":"ContainerDied","Data":"40cedbd7575a1bfe6be57a8b83b7c45b1fbfdf5fe1698517aa9259076ff6d6eb"} Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.539524 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40cedbd7575a1bfe6be57a8b83b7c45b1fbfdf5fe1698517aa9259076ff6d6eb" Dec 08 21:00:03 crc kubenswrapper[4781]: I1208 21:00:03.539536 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420460-6dmj2" Dec 08 21:00:04 crc kubenswrapper[4781]: I1208 21:00:04.057027 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs"] Dec 08 21:00:04 crc kubenswrapper[4781]: I1208 21:00:04.068543 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420415-np6zs"] Dec 08 21:00:04 crc kubenswrapper[4781]: I1208 21:00:04.138963 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d088452a-eba7-4ada-a1d1-312f3471960b" path="/var/lib/kubelet/pods/d088452a-eba7-4ada-a1d1-312f3471960b/volumes" Dec 08 21:00:04 crc kubenswrapper[4781]: I1208 21:00:04.288364 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qdmzv" Dec 08 21:00:04 crc kubenswrapper[4781]: I1208 21:00:04.288431 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qdmzv" Dec 08 21:00:04 crc kubenswrapper[4781]: I1208 21:00:04.335767 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qdmzv" Dec 08 21:00:04 crc kubenswrapper[4781]: I1208 21:00:04.596895 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qdmzv" Dec 08 21:00:06 crc kubenswrapper[4781]: I1208 21:00:06.651344 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 21:00:06 crc kubenswrapper[4781]: I1208 21:00:06.651844 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 21:00:06 crc kubenswrapper[4781]: I1208 21:00:06.699209 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 21:00:07 crc kubenswrapper[4781]: I1208 21:00:07.631066 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.373461 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdmzv"] Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.374062 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qdmzv" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerName="registry-server" containerID="cri-o://967655d68d64c144e1f0d62c3d97ffb55614ec5f1960f4e43a6ead31c48ef55e" gracePeriod=2 Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.591833 4781 generic.go:334] "Generic (PLEG): container finished" podID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerID="967655d68d64c144e1f0d62c3d97ffb55614ec5f1960f4e43a6ead31c48ef55e" exitCode=0 Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.591939 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdmzv" event={"ID":"8b08acad-6664-4fd4-b958-e1fb5d34b7e9","Type":"ContainerDied","Data":"967655d68d64c144e1f0d62c3d97ffb55614ec5f1960f4e43a6ead31c48ef55e"} Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.881021 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdmzv" Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.945870 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bzm9\" (UniqueName: \"kubernetes.io/projected/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-kube-api-access-7bzm9\") pod \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.945998 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-utilities\") pod \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.946027 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-catalog-content\") pod \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\" (UID: \"8b08acad-6664-4fd4-b958-e1fb5d34b7e9\") " Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.946840 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-utilities" (OuterVolumeSpecName: "utilities") pod "8b08acad-6664-4fd4-b958-e1fb5d34b7e9" (UID: "8b08acad-6664-4fd4-b958-e1fb5d34b7e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:00:08 crc kubenswrapper[4781]: I1208 21:00:08.950989 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-kube-api-access-7bzm9" (OuterVolumeSpecName: "kube-api-access-7bzm9") pod "8b08acad-6664-4fd4-b958-e1fb5d34b7e9" (UID: "8b08acad-6664-4fd4-b958-e1fb5d34b7e9"). InnerVolumeSpecName "kube-api-access-7bzm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.004622 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b08acad-6664-4fd4-b958-e1fb5d34b7e9" (UID: "8b08acad-6664-4fd4-b958-e1fb5d34b7e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.048336 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bzm9\" (UniqueName: \"kubernetes.io/projected/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-kube-api-access-7bzm9\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.048382 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.048396 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b08acad-6664-4fd4-b958-e1fb5d34b7e9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.635531 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdmzv" event={"ID":"8b08acad-6664-4fd4-b958-e1fb5d34b7e9","Type":"ContainerDied","Data":"253279ac1fb5010c2ea8edf01f7ad21eeb59c27dcc567e36ddc0cb1c3f1d12f7"} Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.635806 4781 scope.go:117] "RemoveContainer" containerID="967655d68d64c144e1f0d62c3d97ffb55614ec5f1960f4e43a6ead31c48ef55e" Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.635869 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdmzv" Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.674651 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdmzv"] Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.683538 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qdmzv"] Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.686048 4781 scope.go:117] "RemoveContainer" containerID="2b485db77a4a47964409f952e958afa9d6d581e0e73003c28005ff7d446e0bda" Dec 08 21:00:09 crc kubenswrapper[4781]: I1208 21:00:09.710718 4781 scope.go:117] "RemoveContainer" containerID="3f340a8d155231c8ecf4631ab1c2700c96131efa627199e11eaceff95ec50b55" Dec 08 21:00:10 crc kubenswrapper[4781]: I1208 21:00:10.135843 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" path="/var/lib/kubelet/pods/8b08acad-6664-4fd4-b958-e1fb5d34b7e9/volumes" Dec 08 21:00:10 crc kubenswrapper[4781]: I1208 21:00:10.570635 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxm55"] Dec 08 21:00:10 crc kubenswrapper[4781]: I1208 21:00:10.570870 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dxm55" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerName="registry-server" containerID="cri-o://597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89" gracePeriod=2 Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.347370 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.530456 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2gkz\" (UniqueName: \"kubernetes.io/projected/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-kube-api-access-c2gkz\") pod \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.531128 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-utilities\") pod \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.531193 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-catalog-content\") pod \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\" (UID: \"5d1e241c-c400-49f2-ad73-1d0c0a3652cc\") " Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.532153 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-utilities" (OuterVolumeSpecName: "utilities") pod "5d1e241c-c400-49f2-ad73-1d0c0a3652cc" (UID: "5d1e241c-c400-49f2-ad73-1d0c0a3652cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.540656 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-kube-api-access-c2gkz" (OuterVolumeSpecName: "kube-api-access-c2gkz") pod "5d1e241c-c400-49f2-ad73-1d0c0a3652cc" (UID: "5d1e241c-c400-49f2-ad73-1d0c0a3652cc"). InnerVolumeSpecName "kube-api-access-c2gkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.561278 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d1e241c-c400-49f2-ad73-1d0c0a3652cc" (UID: "5d1e241c-c400-49f2-ad73-1d0c0a3652cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.633304 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.633350 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.633364 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2gkz\" (UniqueName: \"kubernetes.io/projected/5d1e241c-c400-49f2-ad73-1d0c0a3652cc-kube-api-access-c2gkz\") on node \"crc\" DevicePath \"\"" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.660369 4781 generic.go:334] "Generic (PLEG): container finished" podID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerID="597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89" exitCode=0 Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.660422 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxm55" event={"ID":"5d1e241c-c400-49f2-ad73-1d0c0a3652cc","Type":"ContainerDied","Data":"597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89"} Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.660450 4781 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dxm55" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.660491 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dxm55" event={"ID":"5d1e241c-c400-49f2-ad73-1d0c0a3652cc","Type":"ContainerDied","Data":"8cb30f3fb9d865b3f5b84b4395625f30740b596de8274d8998612eabc6528d79"} Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.660509 4781 scope.go:117] "RemoveContainer" containerID="597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.693007 4781 scope.go:117] "RemoveContainer" containerID="6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.700824 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxm55"] Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.711704 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dxm55"] Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.712789 4781 scope.go:117] "RemoveContainer" containerID="e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.764207 4781 scope.go:117] "RemoveContainer" containerID="597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89" Dec 08 21:00:11 crc kubenswrapper[4781]: E1208 21:00:11.765271 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89\": container with ID starting with 597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89 not found: ID does not exist" containerID="597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.765403 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89"} err="failed to get container status \"597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89\": rpc error: code = NotFound desc = could not find container \"597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89\": container with ID starting with 597cd53d7c010d5c49f996916a254b5dee0eeeabc0ac1e22864c4b859d358b89 not found: ID does not exist" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.765519 4781 scope.go:117] "RemoveContainer" containerID="6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797" Dec 08 21:00:11 crc kubenswrapper[4781]: E1208 21:00:11.765862 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797\": container with ID starting with 6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797 not found: ID does not exist" containerID="6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.765889 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797"} err="failed to get container status \"6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797\": rpc error: code = NotFound desc = could not find container \"6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797\": container with ID starting with 6525d62ee5faa51f80aa3a88fa7fbb0265f528abafd618e1d65a868247a94797 not found: ID does not exist" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.765907 4781 scope.go:117] "RemoveContainer" containerID="e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521" Dec 08 21:00:11 crc kubenswrapper[4781]: E1208 
21:00:11.766278 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521\": container with ID starting with e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521 not found: ID does not exist" containerID="e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521" Dec 08 21:00:11 crc kubenswrapper[4781]: I1208 21:00:11.766312 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521"} err="failed to get container status \"e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521\": rpc error: code = NotFound desc = could not find container \"e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521\": container with ID starting with e21e0f08fd333d6bf8135a3e4d2eac15f5583c88c647289b8395c29fc2f48521 not found: ID does not exist" Dec 08 21:00:12 crc kubenswrapper[4781]: I1208 21:00:12.137507 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" path="/var/lib/kubelet/pods/5d1e241c-c400-49f2-ad73-1d0c0a3652cc/volumes" Dec 08 21:00:14 crc kubenswrapper[4781]: I1208 21:00:14.158490 4781 scope.go:117] "RemoveContainer" containerID="024f33453b3f33f566d3d78ff09833f46fe509a1c6ee98970c86f0340f4ee7b0" Dec 08 21:00:16 crc kubenswrapper[4781]: I1208 21:00:16.125764 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:00:16 crc kubenswrapper[4781]: E1208 21:00:16.126501 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:00:28 crc kubenswrapper[4781]: I1208 21:00:28.125885 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:00:28 crc kubenswrapper[4781]: E1208 21:00:28.126733 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:00:43 crc kubenswrapper[4781]: I1208 21:00:43.126395 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:00:43 crc kubenswrapper[4781]: E1208 21:00:43.127492 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:00:54 crc kubenswrapper[4781]: I1208 21:00:54.133470 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:00:54 crc kubenswrapper[4781]: E1208 21:00:54.134335 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.154339 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29420461-q4qhq"] Dec 08 21:01:00 crc kubenswrapper[4781]: E1208 21:01:00.155430 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerName="extract-utilities" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155449 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerName="extract-utilities" Dec 08 21:01:00 crc kubenswrapper[4781]: E1208 21:01:00.155464 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerName="registry-server" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155475 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerName="registry-server" Dec 08 21:01:00 crc kubenswrapper[4781]: E1208 21:01:00.155495 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerName="extract-content" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155503 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerName="extract-content" Dec 08 21:01:00 crc kubenswrapper[4781]: E1208 21:01:00.155516 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerName="registry-server" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155522 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerName="registry-server" Dec 08 21:01:00 crc 
kubenswrapper[4781]: E1208 21:01:00.155558 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b384f885-0de9-499a-87c4-574c73a01f6a" containerName="collect-profiles" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155565 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="b384f885-0de9-499a-87c4-574c73a01f6a" containerName="collect-profiles" Dec 08 21:01:00 crc kubenswrapper[4781]: E1208 21:01:00.155580 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerName="extract-content" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155587 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerName="extract-content" Dec 08 21:01:00 crc kubenswrapper[4781]: E1208 21:01:00.155598 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerName="extract-utilities" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155605 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerName="extract-utilities" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155797 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="b384f885-0de9-499a-87c4-574c73a01f6a" containerName="collect-profiles" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155817 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1e241c-c400-49f2-ad73-1d0c0a3652cc" containerName="registry-server" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.155829 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b08acad-6664-4fd4-b958-e1fb5d34b7e9" containerName="registry-server" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.156633 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.166579 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29420461-q4qhq"] Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.257975 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9gpm\" (UniqueName: \"kubernetes.io/projected/84bf4f32-efbb-4625-9863-20c2b169937a-kube-api-access-n9gpm\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.258033 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-combined-ca-bundle\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.258063 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-fernet-keys\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.258153 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-config-data\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.360078 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-n9gpm\" (UniqueName: \"kubernetes.io/projected/84bf4f32-efbb-4625-9863-20c2b169937a-kube-api-access-n9gpm\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.360340 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-combined-ca-bundle\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.360372 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-fernet-keys\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.360442 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-config-data\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.366380 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-config-data\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.366400 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-combined-ca-bundle\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.367538 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-fernet-keys\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.382488 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9gpm\" (UniqueName: \"kubernetes.io/projected/84bf4f32-efbb-4625-9863-20c2b169937a-kube-api-access-n9gpm\") pod \"keystone-cron-29420461-q4qhq\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.484422 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:00 crc kubenswrapper[4781]: I1208 21:01:00.941699 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29420461-q4qhq"] Dec 08 21:01:01 crc kubenswrapper[4781]: I1208 21:01:01.106998 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29420461-q4qhq" event={"ID":"84bf4f32-efbb-4625-9863-20c2b169937a","Type":"ContainerStarted","Data":"7c956eaf5f6beb8f4af2cacaf38ea8e7c1e8975e46f729725c1b53a169e5b7e1"} Dec 08 21:01:02 crc kubenswrapper[4781]: I1208 21:01:02.117264 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29420461-q4qhq" event={"ID":"84bf4f32-efbb-4625-9863-20c2b169937a","Type":"ContainerStarted","Data":"e6ae356ea6706a1a1eb84af462fad734865fbab11b6e840217467ceb072e8cb1"} Dec 08 21:01:02 crc kubenswrapper[4781]: I1208 21:01:02.143710 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29420461-q4qhq" podStartSLOduration=2.143690687 podStartE2EDuration="2.143690687s" podCreationTimestamp="2025-12-08 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 21:01:02.13924365 +0000 UTC m=+3378.290527027" watchObservedRunningTime="2025-12-08 21:01:02.143690687 +0000 UTC m=+3378.294974054" Dec 08 21:01:04 crc kubenswrapper[4781]: I1208 21:01:04.137343 4781 generic.go:334] "Generic (PLEG): container finished" podID="84bf4f32-efbb-4625-9863-20c2b169937a" containerID="e6ae356ea6706a1a1eb84af462fad734865fbab11b6e840217467ceb072e8cb1" exitCode=0 Dec 08 21:01:04 crc kubenswrapper[4781]: I1208 21:01:04.137430 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29420461-q4qhq" 
event={"ID":"84bf4f32-efbb-4625-9863-20c2b169937a","Type":"ContainerDied","Data":"e6ae356ea6706a1a1eb84af462fad734865fbab11b6e840217467ceb072e8cb1"} Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.602785 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.764968 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-combined-ca-bundle\") pod \"84bf4f32-efbb-4625-9863-20c2b169937a\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.765040 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-config-data\") pod \"84bf4f32-efbb-4625-9863-20c2b169937a\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.765124 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9gpm\" (UniqueName: \"kubernetes.io/projected/84bf4f32-efbb-4625-9863-20c2b169937a-kube-api-access-n9gpm\") pod \"84bf4f32-efbb-4625-9863-20c2b169937a\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.765150 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-fernet-keys\") pod \"84bf4f32-efbb-4625-9863-20c2b169937a\" (UID: \"84bf4f32-efbb-4625-9863-20c2b169937a\") " Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.770386 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "84bf4f32-efbb-4625-9863-20c2b169937a" (UID: "84bf4f32-efbb-4625-9863-20c2b169937a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.780019 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bf4f32-efbb-4625-9863-20c2b169937a-kube-api-access-n9gpm" (OuterVolumeSpecName: "kube-api-access-n9gpm") pod "84bf4f32-efbb-4625-9863-20c2b169937a" (UID: "84bf4f32-efbb-4625-9863-20c2b169937a"). InnerVolumeSpecName "kube-api-access-n9gpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.796600 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84bf4f32-efbb-4625-9863-20c2b169937a" (UID: "84bf4f32-efbb-4625-9863-20c2b169937a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.821154 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-config-data" (OuterVolumeSpecName: "config-data") pod "84bf4f32-efbb-4625-9863-20c2b169937a" (UID: "84bf4f32-efbb-4625-9863-20c2b169937a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.868147 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9gpm\" (UniqueName: \"kubernetes.io/projected/84bf4f32-efbb-4625-9863-20c2b169937a-kube-api-access-n9gpm\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.868185 4781 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.868197 4781 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:05 crc kubenswrapper[4781]: I1208 21:01:05.868240 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84bf4f32-efbb-4625-9863-20c2b169937a-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:06 crc kubenswrapper[4781]: I1208 21:01:06.157781 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29420461-q4qhq" event={"ID":"84bf4f32-efbb-4625-9863-20c2b169937a","Type":"ContainerDied","Data":"7c956eaf5f6beb8f4af2cacaf38ea8e7c1e8975e46f729725c1b53a169e5b7e1"} Dec 08 21:01:06 crc kubenswrapper[4781]: I1208 21:01:06.157820 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c956eaf5f6beb8f4af2cacaf38ea8e7c1e8975e46f729725c1b53a169e5b7e1" Dec 08 21:01:06 crc kubenswrapper[4781]: I1208 21:01:06.157839 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29420461-q4qhq" Dec 08 21:01:09 crc kubenswrapper[4781]: I1208 21:01:09.125723 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:01:09 crc kubenswrapper[4781]: E1208 21:01:09.126480 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:01:21 crc kubenswrapper[4781]: I1208 21:01:21.126268 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:01:21 crc kubenswrapper[4781]: E1208 21:01:21.127147 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:01:22 crc kubenswrapper[4781]: I1208 21:01:22.311590 4781 generic.go:334] "Generic (PLEG): container finished" podID="e2b97db1-1e2a-45e9-b959-fde154131c3b" containerID="35e14ec4dbbbc04bf381fc6644d4dab25b52641610e2c885f5d121397e5cbdd9" exitCode=0 Dec 08 21:01:22 crc kubenswrapper[4781]: I1208 21:01:22.311660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e2b97db1-1e2a-45e9-b959-fde154131c3b","Type":"ContainerDied","Data":"35e14ec4dbbbc04bf381fc6644d4dab25b52641610e2c885f5d121397e5cbdd9"} Dec 08 21:01:23 crc 
kubenswrapper[4781]: I1208 21:01:23.673298 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864064 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-temporary\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864116 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864176 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsq6g\" (UniqueName: \"kubernetes.io/projected/e2b97db1-1e2a-45e9-b959-fde154131c3b-kube-api-access-lsq6g\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864316 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ssh-key\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864374 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-config-data\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc 
kubenswrapper[4781]: I1208 21:01:23.864419 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864466 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config-secret\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864515 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ca-certs\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864558 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-workdir\") pod \"e2b97db1-1e2a-45e9-b959-fde154131c3b\" (UID: \"e2b97db1-1e2a-45e9-b959-fde154131c3b\") " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.864909 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.865103 4781 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.866226 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-config-data" (OuterVolumeSpecName: "config-data") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.877412 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b97db1-1e2a-45e9-b959-fde154131c3b-kube-api-access-lsq6g" (OuterVolumeSpecName: "kube-api-access-lsq6g") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "kube-api-access-lsq6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.878328 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.895026 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.898450 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.904670 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.910865 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.927373 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e2b97db1-1e2a-45e9-b959-fde154131c3b" (UID: "e2b97db1-1e2a-45e9-b959-fde154131c3b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.966320 4781 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-config-data\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.966400 4781 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.966416 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.966428 4781 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.966438 4781 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e2b97db1-1e2a-45e9-b959-fde154131c3b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.966448 4781 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/e2b97db1-1e2a-45e9-b959-fde154131c3b-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.966458 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsq6g\" (UniqueName: \"kubernetes.io/projected/e2b97db1-1e2a-45e9-b959-fde154131c3b-kube-api-access-lsq6g\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.966467 4781 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b97db1-1e2a-45e9-b959-fde154131c3b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:23 crc kubenswrapper[4781]: I1208 21:01:23.988148 4781 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 08 21:01:24 crc kubenswrapper[4781]: I1208 21:01:24.068207 4781 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 08 21:01:24 crc kubenswrapper[4781]: I1208 21:01:24.333868 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e2b97db1-1e2a-45e9-b959-fde154131c3b","Type":"ContainerDied","Data":"f91dd50f1260b239b8df5c85be4416622ea1af2e361485df642b46941feddc8d"} Dec 08 21:01:24 crc kubenswrapper[4781]: I1208 21:01:24.333936 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f91dd50f1260b239b8df5c85be4416622ea1af2e361485df642b46941feddc8d" Dec 08 21:01:24 crc kubenswrapper[4781]: I1208 21:01:24.334012 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.070248 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 08 21:01:26 crc kubenswrapper[4781]: E1208 21:01:26.071077 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b97db1-1e2a-45e9-b959-fde154131c3b" containerName="tempest-tests-tempest-tests-runner" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.071094 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b97db1-1e2a-45e9-b959-fde154131c3b" containerName="tempest-tests-tempest-tests-runner" Dec 08 21:01:26 crc kubenswrapper[4781]: E1208 21:01:26.071135 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bf4f32-efbb-4625-9863-20c2b169937a" containerName="keystone-cron" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.071148 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bf4f32-efbb-4625-9863-20c2b169937a" containerName="keystone-cron" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.071383 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b97db1-1e2a-45e9-b959-fde154131c3b" containerName="tempest-tests-tempest-tests-runner" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.071410 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bf4f32-efbb-4625-9863-20c2b169937a" containerName="keystone-cron" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.072201 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.076276 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-24hrz" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.088085 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.207812 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7t8\" (UniqueName: \"kubernetes.io/projected/71062a6c-6abb-416f-9c98-a98fb20ad2ad-kube-api-access-nr7t8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"71062a6c-6abb-416f-9c98-a98fb20ad2ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.207941 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"71062a6c-6abb-416f-9c98-a98fb20ad2ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.310040 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"71062a6c-6abb-416f-9c98-a98fb20ad2ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.310353 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7t8\" (UniqueName: 
\"kubernetes.io/projected/71062a6c-6abb-416f-9c98-a98fb20ad2ad-kube-api-access-nr7t8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"71062a6c-6abb-416f-9c98-a98fb20ad2ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.310761 4781 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"71062a6c-6abb-416f-9c98-a98fb20ad2ad\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.338497 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7t8\" (UniqueName: \"kubernetes.io/projected/71062a6c-6abb-416f-9c98-a98fb20ad2ad-kube-api-access-nr7t8\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"71062a6c-6abb-416f-9c98-a98fb20ad2ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.352980 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"71062a6c-6abb-416f-9c98-a98fb20ad2ad\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.398366 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 08 21:01:26 crc kubenswrapper[4781]: I1208 21:01:26.881194 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 08 21:01:27 crc kubenswrapper[4781]: I1208 21:01:27.368771 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"71062a6c-6abb-416f-9c98-a98fb20ad2ad","Type":"ContainerStarted","Data":"9454612db72646d90506fed845ff5284c1ae69f3a139e73393377293b9d81466"} Dec 08 21:01:28 crc kubenswrapper[4781]: I1208 21:01:28.380559 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"71062a6c-6abb-416f-9c98-a98fb20ad2ad","Type":"ContainerStarted","Data":"bf8fd804a9b05e9b38d18bea450571cf7603783848a2ed5cd22e75375d0e1184"} Dec 08 21:01:28 crc kubenswrapper[4781]: I1208 21:01:28.402703 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.557258925 podStartE2EDuration="2.402671249s" podCreationTimestamp="2025-12-08 21:01:26 +0000 UTC" firstStartedPulling="2025-12-08 21:01:26.887727109 +0000 UTC m=+3403.039010486" lastFinishedPulling="2025-12-08 21:01:27.733139433 +0000 UTC m=+3403.884422810" observedRunningTime="2025-12-08 21:01:28.395611236 +0000 UTC m=+3404.546894613" watchObservedRunningTime="2025-12-08 21:01:28.402671249 +0000 UTC m=+3404.553954636" Dec 08 21:01:36 crc kubenswrapper[4781]: I1208 21:01:36.130381 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:01:36 crc kubenswrapper[4781]: E1208 21:01:36.131117 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:01:49 crc kubenswrapper[4781]: I1208 21:01:49.126427 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:01:49 crc kubenswrapper[4781]: E1208 21:01:49.127891 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:01:52 crc kubenswrapper[4781]: I1208 21:01:52.815409 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2k9gc/must-gather-8drg4"] Dec 08 21:01:52 crc kubenswrapper[4781]: I1208 21:01:52.846192 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:01:52 crc kubenswrapper[4781]: I1208 21:01:52.850860 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2k9gc"/"kube-root-ca.crt" Dec 08 21:01:52 crc kubenswrapper[4781]: I1208 21:01:52.851317 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2k9gc"/"default-dockercfg-2vsfj" Dec 08 21:01:52 crc kubenswrapper[4781]: I1208 21:01:52.851517 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2k9gc"/"openshift-service-ca.crt" Dec 08 21:01:52 crc kubenswrapper[4781]: I1208 21:01:52.875808 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2k9gc/must-gather-8drg4"] Dec 08 21:01:53 crc kubenswrapper[4781]: I1208 21:01:53.018947 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/febcd94c-89fc-40bc-8e6c-28540f33048d-must-gather-output\") pod \"must-gather-8drg4\" (UID: \"febcd94c-89fc-40bc-8e6c-28540f33048d\") " pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:01:53 crc kubenswrapper[4781]: I1208 21:01:53.019367 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc7q5\" (UniqueName: \"kubernetes.io/projected/febcd94c-89fc-40bc-8e6c-28540f33048d-kube-api-access-rc7q5\") pod \"must-gather-8drg4\" (UID: \"febcd94c-89fc-40bc-8e6c-28540f33048d\") " pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:01:53 crc kubenswrapper[4781]: I1208 21:01:53.121503 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/febcd94c-89fc-40bc-8e6c-28540f33048d-must-gather-output\") pod \"must-gather-8drg4\" (UID: \"febcd94c-89fc-40bc-8e6c-28540f33048d\") " 
pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:01:53 crc kubenswrapper[4781]: I1208 21:01:53.121623 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc7q5\" (UniqueName: \"kubernetes.io/projected/febcd94c-89fc-40bc-8e6c-28540f33048d-kube-api-access-rc7q5\") pod \"must-gather-8drg4\" (UID: \"febcd94c-89fc-40bc-8e6c-28540f33048d\") " pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:01:53 crc kubenswrapper[4781]: I1208 21:01:53.122101 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/febcd94c-89fc-40bc-8e6c-28540f33048d-must-gather-output\") pod \"must-gather-8drg4\" (UID: \"febcd94c-89fc-40bc-8e6c-28540f33048d\") " pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:01:53 crc kubenswrapper[4781]: I1208 21:01:53.140668 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc7q5\" (UniqueName: \"kubernetes.io/projected/febcd94c-89fc-40bc-8e6c-28540f33048d-kube-api-access-rc7q5\") pod \"must-gather-8drg4\" (UID: \"febcd94c-89fc-40bc-8e6c-28540f33048d\") " pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:01:53 crc kubenswrapper[4781]: I1208 21:01:53.192950 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:01:53 crc kubenswrapper[4781]: I1208 21:01:53.657873 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2k9gc/must-gather-8drg4"] Dec 08 21:01:53 crc kubenswrapper[4781]: W1208 21:01:53.669838 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebcd94c_89fc_40bc_8e6c_28540f33048d.slice/crio-e9d0ac88a563da64b4b3471a3b7fc6bd082098c397c5c2c8f08a00da1d6a1a9c WatchSource:0}: Error finding container e9d0ac88a563da64b4b3471a3b7fc6bd082098c397c5c2c8f08a00da1d6a1a9c: Status 404 returned error can't find the container with id e9d0ac88a563da64b4b3471a3b7fc6bd082098c397c5c2c8f08a00da1d6a1a9c Dec 08 21:01:54 crc kubenswrapper[4781]: I1208 21:01:54.648035 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/must-gather-8drg4" event={"ID":"febcd94c-89fc-40bc-8e6c-28540f33048d","Type":"ContainerStarted","Data":"e9d0ac88a563da64b4b3471a3b7fc6bd082098c397c5c2c8f08a00da1d6a1a9c"} Dec 08 21:01:58 crc kubenswrapper[4781]: I1208 21:01:58.685532 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/must-gather-8drg4" event={"ID":"febcd94c-89fc-40bc-8e6c-28540f33048d","Type":"ContainerStarted","Data":"146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c"} Dec 08 21:01:58 crc kubenswrapper[4781]: I1208 21:01:58.686267 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/must-gather-8drg4" event={"ID":"febcd94c-89fc-40bc-8e6c-28540f33048d","Type":"ContainerStarted","Data":"f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb"} Dec 08 21:01:58 crc kubenswrapper[4781]: I1208 21:01:58.709525 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2k9gc/must-gather-8drg4" podStartSLOduration=2.551801966 
podStartE2EDuration="6.709507829s" podCreationTimestamp="2025-12-08 21:01:52 +0000 UTC" firstStartedPulling="2025-12-08 21:01:53.674758126 +0000 UTC m=+3429.826041503" lastFinishedPulling="2025-12-08 21:01:57.832463989 +0000 UTC m=+3433.983747366" observedRunningTime="2025-12-08 21:01:58.699390619 +0000 UTC m=+3434.850673996" watchObservedRunningTime="2025-12-08 21:01:58.709507829 +0000 UTC m=+3434.860791206" Dec 08 21:02:01 crc kubenswrapper[4781]: I1208 21:02:01.701365 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-tkkb6"] Dec 08 21:02:01 crc kubenswrapper[4781]: I1208 21:02:01.702800 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:01 crc kubenswrapper[4781]: I1208 21:02:01.795601 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2mbx\" (UniqueName: \"kubernetes.io/projected/555db3d3-6462-4506-9590-0a8550cfde91-kube-api-access-w2mbx\") pod \"crc-debug-tkkb6\" (UID: \"555db3d3-6462-4506-9590-0a8550cfde91\") " pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:01 crc kubenswrapper[4781]: I1208 21:02:01.796008 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/555db3d3-6462-4506-9590-0a8550cfde91-host\") pod \"crc-debug-tkkb6\" (UID: \"555db3d3-6462-4506-9590-0a8550cfde91\") " pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:01 crc kubenswrapper[4781]: I1208 21:02:01.897962 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/555db3d3-6462-4506-9590-0a8550cfde91-host\") pod \"crc-debug-tkkb6\" (UID: \"555db3d3-6462-4506-9590-0a8550cfde91\") " pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:01 crc kubenswrapper[4781]: I1208 21:02:01.898082 4781 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2mbx\" (UniqueName: \"kubernetes.io/projected/555db3d3-6462-4506-9590-0a8550cfde91-kube-api-access-w2mbx\") pod \"crc-debug-tkkb6\" (UID: \"555db3d3-6462-4506-9590-0a8550cfde91\") " pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:01 crc kubenswrapper[4781]: I1208 21:02:01.898322 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/555db3d3-6462-4506-9590-0a8550cfde91-host\") pod \"crc-debug-tkkb6\" (UID: \"555db3d3-6462-4506-9590-0a8550cfde91\") " pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:01 crc kubenswrapper[4781]: I1208 21:02:01.917116 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2mbx\" (UniqueName: \"kubernetes.io/projected/555db3d3-6462-4506-9590-0a8550cfde91-kube-api-access-w2mbx\") pod \"crc-debug-tkkb6\" (UID: \"555db3d3-6462-4506-9590-0a8550cfde91\") " pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:02 crc kubenswrapper[4781]: I1208 21:02:02.028634 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:02 crc kubenswrapper[4781]: W1208 21:02:02.089836 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod555db3d3_6462_4506_9590_0a8550cfde91.slice/crio-4394bdb02831092cb8e161bb15ef939bd5f7cc05e8882fe2f26ada9738eacc9a WatchSource:0}: Error finding container 4394bdb02831092cb8e161bb15ef939bd5f7cc05e8882fe2f26ada9738eacc9a: Status 404 returned error can't find the container with id 4394bdb02831092cb8e161bb15ef939bd5f7cc05e8882fe2f26ada9738eacc9a Dec 08 21:02:02 crc kubenswrapper[4781]: I1208 21:02:02.128299 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:02:02 crc kubenswrapper[4781]: E1208 21:02:02.128537 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:02:02 crc kubenswrapper[4781]: I1208 21:02:02.727717 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" event={"ID":"555db3d3-6462-4506-9590-0a8550cfde91","Type":"ContainerStarted","Data":"4394bdb02831092cb8e161bb15ef939bd5f7cc05e8882fe2f26ada9738eacc9a"} Dec 08 21:02:03 crc kubenswrapper[4781]: E1208 21:02:03.659854 4781 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.159:49340->38.102.83.159:36535: read tcp 38.102.83.159:49340->38.102.83.159:36535: read: connection reset by peer Dec 08 21:02:14 crc kubenswrapper[4781]: I1208 21:02:14.893352 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" event={"ID":"555db3d3-6462-4506-9590-0a8550cfde91","Type":"ContainerStarted","Data":"9a67bd3c85e6849b6e485178a0e5908f6b52a5b3f7a79e3a7b641ab5a03eca4d"} Dec 08 21:02:14 crc kubenswrapper[4781]: I1208 21:02:14.922869 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" podStartSLOduration=1.788861466 podStartE2EDuration="13.922847452s" podCreationTimestamp="2025-12-08 21:02:01 +0000 UTC" firstStartedPulling="2025-12-08 21:02:02.091977757 +0000 UTC m=+3438.243261144" lastFinishedPulling="2025-12-08 21:02:14.225963753 +0000 UTC m=+3450.377247130" observedRunningTime="2025-12-08 21:02:14.915986016 +0000 UTC m=+3451.067269403" watchObservedRunningTime="2025-12-08 21:02:14.922847452 +0000 UTC m=+3451.074130829" Dec 08 21:02:17 crc kubenswrapper[4781]: I1208 21:02:17.125525 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:02:17 crc kubenswrapper[4781]: E1208 21:02:17.126845 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:02:31 crc kubenswrapper[4781]: I1208 21:02:31.126696 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:02:32 crc kubenswrapper[4781]: I1208 21:02:32.043521 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" 
event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"27ff1a6c014cb26a25650d45b4f247e2b1d66fd533143c98c5d1073d95951d01"} Dec 08 21:02:54 crc kubenswrapper[4781]: I1208 21:02:54.274991 4781 generic.go:334] "Generic (PLEG): container finished" podID="555db3d3-6462-4506-9590-0a8550cfde91" containerID="9a67bd3c85e6849b6e485178a0e5908f6b52a5b3f7a79e3a7b641ab5a03eca4d" exitCode=0 Dec 08 21:02:54 crc kubenswrapper[4781]: I1208 21:02:54.275123 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" event={"ID":"555db3d3-6462-4506-9590-0a8550cfde91","Type":"ContainerDied","Data":"9a67bd3c85e6849b6e485178a0e5908f6b52a5b3f7a79e3a7b641ab5a03eca4d"} Dec 08 21:02:55 crc kubenswrapper[4781]: I1208 21:02:55.382996 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:55 crc kubenswrapper[4781]: I1208 21:02:55.417800 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-tkkb6"] Dec 08 21:02:55 crc kubenswrapper[4781]: I1208 21:02:55.425438 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-tkkb6"] Dec 08 21:02:55 crc kubenswrapper[4781]: I1208 21:02:55.435886 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2mbx\" (UniqueName: \"kubernetes.io/projected/555db3d3-6462-4506-9590-0a8550cfde91-kube-api-access-w2mbx\") pod \"555db3d3-6462-4506-9590-0a8550cfde91\" (UID: \"555db3d3-6462-4506-9590-0a8550cfde91\") " Dec 08 21:02:55 crc kubenswrapper[4781]: I1208 21:02:55.436037 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/555db3d3-6462-4506-9590-0a8550cfde91-host\") pod \"555db3d3-6462-4506-9590-0a8550cfde91\" (UID: \"555db3d3-6462-4506-9590-0a8550cfde91\") " Dec 08 21:02:55 crc 
kubenswrapper[4781]: I1208 21:02:55.436218 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/555db3d3-6462-4506-9590-0a8550cfde91-host" (OuterVolumeSpecName: "host") pod "555db3d3-6462-4506-9590-0a8550cfde91" (UID: "555db3d3-6462-4506-9590-0a8550cfde91"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 21:02:55 crc kubenswrapper[4781]: I1208 21:02:55.448231 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555db3d3-6462-4506-9590-0a8550cfde91-kube-api-access-w2mbx" (OuterVolumeSpecName: "kube-api-access-w2mbx") pod "555db3d3-6462-4506-9590-0a8550cfde91" (UID: "555db3d3-6462-4506-9590-0a8550cfde91"). InnerVolumeSpecName "kube-api-access-w2mbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:02:55 crc kubenswrapper[4781]: I1208 21:02:55.537294 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/555db3d3-6462-4506-9590-0a8550cfde91-host\") on node \"crc\" DevicePath \"\"" Dec 08 21:02:55 crc kubenswrapper[4781]: I1208 21:02:55.537323 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2mbx\" (UniqueName: \"kubernetes.io/projected/555db3d3-6462-4506-9590-0a8550cfde91-kube-api-access-w2mbx\") on node \"crc\" DevicePath \"\"" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.142381 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555db3d3-6462-4506-9590-0a8550cfde91" path="/var/lib/kubelet/pods/555db3d3-6462-4506-9590-0a8550cfde91/volumes" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.302386 4781 scope.go:117] "RemoveContainer" containerID="9a67bd3c85e6849b6e485178a0e5908f6b52a5b3f7a79e3a7b641ab5a03eca4d" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.302595 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-tkkb6" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.611340 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-7c5lq"] Dec 08 21:02:56 crc kubenswrapper[4781]: E1208 21:02:56.612549 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555db3d3-6462-4506-9590-0a8550cfde91" containerName="container-00" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.612582 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="555db3d3-6462-4506-9590-0a8550cfde91" containerName="container-00" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.613109 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="555db3d3-6462-4506-9590-0a8550cfde91" containerName="container-00" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.614361 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.662853 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28fcc335-0d07-4383-a8d5-02759eae054c-host\") pod \"crc-debug-7c5lq\" (UID: \"28fcc335-0d07-4383-a8d5-02759eae054c\") " pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.663093 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvrwn\" (UniqueName: \"kubernetes.io/projected/28fcc335-0d07-4383-a8d5-02759eae054c-kube-api-access-hvrwn\") pod \"crc-debug-7c5lq\" (UID: \"28fcc335-0d07-4383-a8d5-02759eae054c\") " pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.764258 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/28fcc335-0d07-4383-a8d5-02759eae054c-host\") pod \"crc-debug-7c5lq\" (UID: \"28fcc335-0d07-4383-a8d5-02759eae054c\") " pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.764414 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvrwn\" (UniqueName: \"kubernetes.io/projected/28fcc335-0d07-4383-a8d5-02759eae054c-kube-api-access-hvrwn\") pod \"crc-debug-7c5lq\" (UID: \"28fcc335-0d07-4383-a8d5-02759eae054c\") " pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.764502 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28fcc335-0d07-4383-a8d5-02759eae054c-host\") pod \"crc-debug-7c5lq\" (UID: \"28fcc335-0d07-4383-a8d5-02759eae054c\") " pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.801611 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvrwn\" (UniqueName: \"kubernetes.io/projected/28fcc335-0d07-4383-a8d5-02759eae054c-kube-api-access-hvrwn\") pod \"crc-debug-7c5lq\" (UID: \"28fcc335-0d07-4383-a8d5-02759eae054c\") " pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:56 crc kubenswrapper[4781]: I1208 21:02:56.936768 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:57 crc kubenswrapper[4781]: I1208 21:02:57.311620 4781 generic.go:334] "Generic (PLEG): container finished" podID="28fcc335-0d07-4383-a8d5-02759eae054c" containerID="27123a67076dc90b356b939a50fe67812b20a611e95479810daa55b003121427" exitCode=0 Dec 08 21:02:57 crc kubenswrapper[4781]: I1208 21:02:57.311749 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" event={"ID":"28fcc335-0d07-4383-a8d5-02759eae054c","Type":"ContainerDied","Data":"27123a67076dc90b356b939a50fe67812b20a611e95479810daa55b003121427"} Dec 08 21:02:57 crc kubenswrapper[4781]: I1208 21:02:57.311986 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" event={"ID":"28fcc335-0d07-4383-a8d5-02759eae054c","Type":"ContainerStarted","Data":"f140943a580dff3674e67c18403024d5e5d9f2f7b084f6010852dd98152e40bc"} Dec 08 21:02:57 crc kubenswrapper[4781]: I1208 21:02:57.806700 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-7c5lq"] Dec 08 21:02:57 crc kubenswrapper[4781]: I1208 21:02:57.814659 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-7c5lq"] Dec 08 21:02:58 crc kubenswrapper[4781]: I1208 21:02:58.444313 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:58 crc kubenswrapper[4781]: I1208 21:02:58.592829 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvrwn\" (UniqueName: \"kubernetes.io/projected/28fcc335-0d07-4383-a8d5-02759eae054c-kube-api-access-hvrwn\") pod \"28fcc335-0d07-4383-a8d5-02759eae054c\" (UID: \"28fcc335-0d07-4383-a8d5-02759eae054c\") " Dec 08 21:02:58 crc kubenswrapper[4781]: I1208 21:02:58.593526 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28fcc335-0d07-4383-a8d5-02759eae054c-host\") pod \"28fcc335-0d07-4383-a8d5-02759eae054c\" (UID: \"28fcc335-0d07-4383-a8d5-02759eae054c\") " Dec 08 21:02:58 crc kubenswrapper[4781]: I1208 21:02:58.593628 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28fcc335-0d07-4383-a8d5-02759eae054c-host" (OuterVolumeSpecName: "host") pod "28fcc335-0d07-4383-a8d5-02759eae054c" (UID: "28fcc335-0d07-4383-a8d5-02759eae054c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 21:02:58 crc kubenswrapper[4781]: I1208 21:02:58.595133 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28fcc335-0d07-4383-a8d5-02759eae054c-host\") on node \"crc\" DevicePath \"\"" Dec 08 21:02:58 crc kubenswrapper[4781]: I1208 21:02:58.598676 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28fcc335-0d07-4383-a8d5-02759eae054c-kube-api-access-hvrwn" (OuterVolumeSpecName: "kube-api-access-hvrwn") pod "28fcc335-0d07-4383-a8d5-02759eae054c" (UID: "28fcc335-0d07-4383-a8d5-02759eae054c"). InnerVolumeSpecName "kube-api-access-hvrwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:02:58 crc kubenswrapper[4781]: I1208 21:02:58.699183 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvrwn\" (UniqueName: \"kubernetes.io/projected/28fcc335-0d07-4383-a8d5-02759eae054c-kube-api-access-hvrwn\") on node \"crc\" DevicePath \"\"" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.023204 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-bz7cv"] Dec 08 21:02:59 crc kubenswrapper[4781]: E1208 21:02:59.023563 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28fcc335-0d07-4383-a8d5-02759eae054c" containerName="container-00" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.023578 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="28fcc335-0d07-4383-a8d5-02759eae054c" containerName="container-00" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.023778 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="28fcc335-0d07-4383-a8d5-02759eae054c" containerName="container-00" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.024350 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.108798 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6555b8b-f509-4d68-839f-76927f356027-host\") pod \"crc-debug-bz7cv\" (UID: \"c6555b8b-f509-4d68-839f-76927f356027\") " pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.108873 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbknb\" (UniqueName: \"kubernetes.io/projected/c6555b8b-f509-4d68-839f-76927f356027-kube-api-access-cbknb\") pod \"crc-debug-bz7cv\" (UID: \"c6555b8b-f509-4d68-839f-76927f356027\") " pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.210861 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6555b8b-f509-4d68-839f-76927f356027-host\") pod \"crc-debug-bz7cv\" (UID: \"c6555b8b-f509-4d68-839f-76927f356027\") " pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.211018 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6555b8b-f509-4d68-839f-76927f356027-host\") pod \"crc-debug-bz7cv\" (UID: \"c6555b8b-f509-4d68-839f-76927f356027\") " pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.211393 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbknb\" (UniqueName: \"kubernetes.io/projected/c6555b8b-f509-4d68-839f-76927f356027-kube-api-access-cbknb\") pod \"crc-debug-bz7cv\" (UID: \"c6555b8b-f509-4d68-839f-76927f356027\") " pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:02:59 crc 
kubenswrapper[4781]: I1208 21:02:59.239754 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbknb\" (UniqueName: \"kubernetes.io/projected/c6555b8b-f509-4d68-839f-76927f356027-kube-api-access-cbknb\") pod \"crc-debug-bz7cv\" (UID: \"c6555b8b-f509-4d68-839f-76927f356027\") " pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.337272 4781 scope.go:117] "RemoveContainer" containerID="27123a67076dc90b356b939a50fe67812b20a611e95479810daa55b003121427" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.337316 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-7c5lq" Dec 08 21:02:59 crc kubenswrapper[4781]: I1208 21:02:59.347157 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:02:59 crc kubenswrapper[4781]: W1208 21:02:59.398415 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6555b8b_f509_4d68_839f_76927f356027.slice/crio-633008c79b461fc193be454cc90f54ece29c61d18e03d9b3b45163654e93474d WatchSource:0}: Error finding container 633008c79b461fc193be454cc90f54ece29c61d18e03d9b3b45163654e93474d: Status 404 returned error can't find the container with id 633008c79b461fc193be454cc90f54ece29c61d18e03d9b3b45163654e93474d Dec 08 21:03:00 crc kubenswrapper[4781]: I1208 21:03:00.139237 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28fcc335-0d07-4383-a8d5-02759eae054c" path="/var/lib/kubelet/pods/28fcc335-0d07-4383-a8d5-02759eae054c/volumes" Dec 08 21:03:00 crc kubenswrapper[4781]: I1208 21:03:00.349167 4781 generic.go:334] "Generic (PLEG): container finished" podID="c6555b8b-f509-4d68-839f-76927f356027" containerID="cbae0b7ccfa0d058a57aadd6c02ce2d3de96ac8314e9ff734c6dfe787886f22d" exitCode=0 Dec 08 21:03:00 crc 
kubenswrapper[4781]: I1208 21:03:00.349209 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" event={"ID":"c6555b8b-f509-4d68-839f-76927f356027","Type":"ContainerDied","Data":"cbae0b7ccfa0d058a57aadd6c02ce2d3de96ac8314e9ff734c6dfe787886f22d"} Dec 08 21:03:00 crc kubenswrapper[4781]: I1208 21:03:00.349236 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" event={"ID":"c6555b8b-f509-4d68-839f-76927f356027","Type":"ContainerStarted","Data":"633008c79b461fc193be454cc90f54ece29c61d18e03d9b3b45163654e93474d"} Dec 08 21:03:00 crc kubenswrapper[4781]: I1208 21:03:00.393031 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-bz7cv"] Dec 08 21:03:00 crc kubenswrapper[4781]: I1208 21:03:00.400963 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2k9gc/crc-debug-bz7cv"] Dec 08 21:03:01 crc kubenswrapper[4781]: I1208 21:03:01.452805 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:03:01 crc kubenswrapper[4781]: I1208 21:03:01.461511 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbknb\" (UniqueName: \"kubernetes.io/projected/c6555b8b-f509-4d68-839f-76927f356027-kube-api-access-cbknb\") pod \"c6555b8b-f509-4d68-839f-76927f356027\" (UID: \"c6555b8b-f509-4d68-839f-76927f356027\") " Dec 08 21:03:01 crc kubenswrapper[4781]: I1208 21:03:01.461890 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6555b8b-f509-4d68-839f-76927f356027-host\") pod \"c6555b8b-f509-4d68-839f-76927f356027\" (UID: \"c6555b8b-f509-4d68-839f-76927f356027\") " Dec 08 21:03:01 crc kubenswrapper[4781]: I1208 21:03:01.461993 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6555b8b-f509-4d68-839f-76927f356027-host" (OuterVolumeSpecName: "host") pod "c6555b8b-f509-4d68-839f-76927f356027" (UID: "c6555b8b-f509-4d68-839f-76927f356027"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 21:03:01 crc kubenswrapper[4781]: I1208 21:03:01.462587 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6555b8b-f509-4d68-839f-76927f356027-host\") on node \"crc\" DevicePath \"\"" Dec 08 21:03:01 crc kubenswrapper[4781]: I1208 21:03:01.466710 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6555b8b-f509-4d68-839f-76927f356027-kube-api-access-cbknb" (OuterVolumeSpecName: "kube-api-access-cbknb") pod "c6555b8b-f509-4d68-839f-76927f356027" (UID: "c6555b8b-f509-4d68-839f-76927f356027"). InnerVolumeSpecName "kube-api-access-cbknb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:03:01 crc kubenswrapper[4781]: I1208 21:03:01.564359 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbknb\" (UniqueName: \"kubernetes.io/projected/c6555b8b-f509-4d68-839f-76927f356027-kube-api-access-cbknb\") on node \"crc\" DevicePath \"\"" Dec 08 21:03:02 crc kubenswrapper[4781]: I1208 21:03:02.136069 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6555b8b-f509-4d68-839f-76927f356027" path="/var/lib/kubelet/pods/c6555b8b-f509-4d68-839f-76927f356027/volumes" Dec 08 21:03:02 crc kubenswrapper[4781]: I1208 21:03:02.366020 4781 scope.go:117] "RemoveContainer" containerID="cbae0b7ccfa0d058a57aadd6c02ce2d3de96ac8314e9ff734c6dfe787886f22d" Dec 08 21:03:02 crc kubenswrapper[4781]: I1208 21:03:02.366040 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2k9gc/crc-debug-bz7cv" Dec 08 21:03:15 crc kubenswrapper[4781]: I1208 21:03:15.660939 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56858f8966-hkmxw_ed36d878-8827-467c-95a0-1450798ad50e/barbican-api/0.log" Dec 08 21:03:15 crc kubenswrapper[4781]: I1208 21:03:15.767064 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56858f8966-hkmxw_ed36d878-8827-467c-95a0-1450798ad50e/barbican-api-log/0.log" Dec 08 21:03:15 crc kubenswrapper[4781]: I1208 21:03:15.909138 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6d674cb658-5bj7r_dcaa98ab-e000-41ef-bf28-189680138c66/barbican-keystone-listener/0.log" Dec 08 21:03:15 crc kubenswrapper[4781]: I1208 21:03:15.977562 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6d674cb658-5bj7r_dcaa98ab-e000-41ef-bf28-189680138c66/barbican-keystone-listener-log/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.051370 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-676d96fbc7-6xfzf_558023e1-d94c-4422-a958-796ba9bf387f/barbican-worker/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.115677 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-676d96fbc7-6xfzf_558023e1-d94c-4422-a958-796ba9bf387f/barbican-worker-log/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.184205 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq_3eca1a1d-a60c-4911-9cf8-fd8a82f9541c/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.352547 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9447ae7c-50db-46a0-aeec-7718944d900e/ceilometer-central-agent/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.391992 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9447ae7c-50db-46a0-aeec-7718944d900e/ceilometer-notification-agent/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.398417 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9447ae7c-50db-46a0-aeec-7718944d900e/proxy-httpd/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.499447 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9447ae7c-50db-46a0-aeec-7718944d900e/sg-core/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.578409 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5c1478d5-3e67-4d45-b4c3-d5612e46db8d/cinder-api-log/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.648496 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5c1478d5-3e67-4d45-b4c3-d5612e46db8d/cinder-api/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 
21:03:16.805020 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ce9b542d-77aa-4a8d-95fe-393a5a0dafa2/cinder-scheduler/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.813032 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ce9b542d-77aa-4a8d-95fe-393a5a0dafa2/probe/0.log" Dec 08 21:03:16 crc kubenswrapper[4781]: I1208 21:03:16.919554 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d_b174f924-296a-45a3-b80b-fdec0f219fa8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.025017 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hbzns_c7f682bd-1ad1-4917-8c54-7f76ef956f09/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.126313 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6574f55bb5-rdnzp_11821178-db83-4950-8900-5f6fcc68f184/init/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.303844 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6574f55bb5-rdnzp_11821178-db83-4950-8900-5f6fcc68f184/init/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.346178 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6574f55bb5-rdnzp_11821178-db83-4950-8900-5f6fcc68f184/dnsmasq-dns/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.421703 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk_aa022451-2529-456c-99bf-9c36b807312e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.539283 4781 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_glance-default-external-api-0_8238f3d8-03ef-4f26-a327-0f9e931aa7a6/glance-httpd/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.570337 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8238f3d8-03ef-4f26-a327-0f9e931aa7a6/glance-log/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.758378 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_901250f8-f46c-49d0-83d3-3b0c7affea54/glance-httpd/0.log" Dec 08 21:03:17 crc kubenswrapper[4781]: I1208 21:03:17.812122 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_901250f8-f46c-49d0-83d3-3b0c7affea54/glance-log/0.log" Dec 08 21:03:18 crc kubenswrapper[4781]: I1208 21:03:18.014350 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75dd9f77c4-85lhw_eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf/horizon/0.log" Dec 08 21:03:18 crc kubenswrapper[4781]: I1208 21:03:18.042526 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-z64vf_98d6ed01-20fc-4e72-a8cf-2e53a8e6103e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:18 crc kubenswrapper[4781]: I1208 21:03:18.237611 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75dd9f77c4-85lhw_eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf/horizon-log/0.log" Dec 08 21:03:18 crc kubenswrapper[4781]: I1208 21:03:18.303446 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-p9qr6_c08d5f1c-63e8-4974-b50c-29b0e8db5e9a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:18 crc kubenswrapper[4781]: I1208 21:03:18.466003 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29420461-q4qhq_84bf4f32-efbb-4625-9863-20c2b169937a/keystone-cron/0.log" Dec 08 21:03:18 crc kubenswrapper[4781]: I1208 21:03:18.586988 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-796ff6864f-m6jf6_47dda6bf-1ed5-4df0-a647-32e518a7514f/keystone-api/0.log" Dec 08 21:03:18 crc kubenswrapper[4781]: I1208 21:03:18.775948 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_acc0a735-094b-4857-8238-5240530c62dc/kube-state-metrics/0.log" Dec 08 21:03:18 crc kubenswrapper[4781]: I1208 21:03:18.798572 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-496zk_0d819560-5e37-4cbe-8276-f5c63dd9610c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:19 crc kubenswrapper[4781]: I1208 21:03:19.135858 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b4c587749-kmfjj_b3890267-5a89-4612-89f0-1bb7ba0e1245/neutron-api/0.log" Dec 08 21:03:19 crc kubenswrapper[4781]: I1208 21:03:19.209151 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b4c587749-kmfjj_b3890267-5a89-4612-89f0-1bb7ba0e1245/neutron-httpd/0.log" Dec 08 21:03:19 crc kubenswrapper[4781]: I1208 21:03:19.393166 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh_f0d82d95-8bf8-4845-a305-cca05358ffdb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:19 crc kubenswrapper[4781]: I1208 21:03:19.880651 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cba17aa8-5de4-4747-bc12-50b1d1b66490/nova-api-log/0.log" Dec 08 21:03:19 crc kubenswrapper[4781]: I1208 21:03:19.940607 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_5f31e53b-8234-4f29-bfcf-d3c037103945/nova-cell0-conductor-conductor/0.log" Dec 08 21:03:20 crc kubenswrapper[4781]: I1208 21:03:20.150642 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_af1b29f3-e2a8-4a48-9d18-8c502c3f435c/nova-cell1-conductor-conductor/0.log" Dec 08 21:03:20 crc kubenswrapper[4781]: I1208 21:03:20.173791 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cba17aa8-5de4-4747-bc12-50b1d1b66490/nova-api-api/0.log" Dec 08 21:03:20 crc kubenswrapper[4781]: I1208 21:03:20.262280 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_28fbba80-b3c1-45f6-ad0a-3435a48fd033/nova-cell1-novncproxy-novncproxy/0.log" Dec 08 21:03:20 crc kubenswrapper[4781]: I1208 21:03:20.419214 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5rsvg_69ce8819-1a24-4b28-9438-c92c07b4dbca/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:20 crc kubenswrapper[4781]: I1208 21:03:20.534042 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2348d949-d7f2-44a5-8e47-58b358d060c8/nova-metadata-log/0.log" Dec 08 21:03:20 crc kubenswrapper[4781]: I1208 21:03:20.823086 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_daf2aa43-ddc7-4618-a3df-665b947b68bd/nova-scheduler-scheduler/0.log" Dec 08 21:03:20 crc kubenswrapper[4781]: I1208 21:03:20.835421 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_95c2c3c0-0733-4bac-bf28-0805d8c9a499/mysql-bootstrap/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.042548 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_95c2c3c0-0733-4bac-bf28-0805d8c9a499/mysql-bootstrap/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: 
I1208 21:03:21.051246 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_95c2c3c0-0733-4bac-bf28-0805d8c9a499/galera/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.220422 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_88416662-c07f-4d9f-b9cb-7f92d21aaa6f/mysql-bootstrap/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.406158 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_88416662-c07f-4d9f-b9cb-7f92d21aaa6f/mysql-bootstrap/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.452343 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_88416662-c07f-4d9f-b9cb-7f92d21aaa6f/galera/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.587413 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ee967571-7083-4ff7-a035-b90fadf420ee/openstackclient/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.616039 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2348d949-d7f2-44a5-8e47-58b358d060c8/nova-metadata-metadata/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.705527 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kf4ss_a7f5afd4-05f3-4954-9dc9-3efa47c22b85/ovn-controller/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.850717 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9w8ql_0d8cf2d7-e85f-49e8-95e2-c1548f506888/openstack-network-exporter/0.log" Dec 08 21:03:21 crc kubenswrapper[4781]: I1208 21:03:21.965724 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9cv7c_b675bf96-ecb2-4098-891f-6a87e0ed5140/ovsdb-server-init/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.088994 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9cv7c_b675bf96-ecb2-4098-891f-6a87e0ed5140/ovsdb-server-init/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.196077 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9cv7c_b675bf96-ecb2-4098-891f-6a87e0ed5140/ovs-vswitchd/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.206046 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9cv7c_b675bf96-ecb2-4098-891f-6a87e0ed5140/ovsdb-server/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.335054 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-4wchf_8f4acc2b-373a-48e5-916b-a0fcfcb83851/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.431913 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a1bbb8af-58b4-4eff-9e81-5206ecc06b2e/openstack-network-exporter/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.475321 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a1bbb8af-58b4-4eff-9e81-5206ecc06b2e/ovn-northd/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.667709 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3530fc96-f407-470c-a960-c7cfd844c517/openstack-network-exporter/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.691346 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3530fc96-f407-470c-a960-c7cfd844c517/ovsdbserver-nb/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.825863 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5544d7c5-67c2-4f2e-9e0f-d8307d831d5d/openstack-network-exporter/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 
21:03:22.872696 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5544d7c5-67c2-4f2e-9e0f-d8307d831d5d/ovsdbserver-sb/0.log" Dec 08 21:03:22 crc kubenswrapper[4781]: I1208 21:03:22.952171 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cb6b5bcfd-8jc9n_684ebfa2-2a23-4f1f-96cc-d436e63feede/placement-api/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.097750 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cb6b5bcfd-8jc9n_684ebfa2-2a23-4f1f-96cc-d436e63feede/placement-log/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.167885 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_984eb37b-647a-4e37-b4bc-6e7a3becb3ce/setup-container/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.370831 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_984eb37b-647a-4e37-b4bc-6e7a3becb3ce/setup-container/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.400304 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_984eb37b-647a-4e37-b4bc-6e7a3becb3ce/rabbitmq/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.459354 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20fc56e1-a1f6-4495-834a-41bfebf14aef/setup-container/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.632690 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20fc56e1-a1f6-4495-834a-41bfebf14aef/setup-container/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.724368 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx_29315241-935b-40dc-b49d-d8f18cbb4d38/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:23 crc 
kubenswrapper[4781]: I1208 21:03:23.733590 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20fc56e1-a1f6-4495-834a-41bfebf14aef/rabbitmq/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.919479 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-jr5wd_3cfbf369-bbb7-4b9e-980d-32fe2cf76c39/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:23 crc kubenswrapper[4781]: I1208 21:03:23.948434 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw_793b9e15-75d8-49b1-8261-cc624d33aaea/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:24 crc kubenswrapper[4781]: I1208 21:03:24.112784 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xxkqq_96db8e7d-bc3a-4804-af50-6f403dbbcc26/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:24 crc kubenswrapper[4781]: I1208 21:03:24.257595 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-b4xf8_857a2e5e-21b6-450e-8578-240e91f419f7/ssh-known-hosts-edpm-deployment/0.log" Dec 08 21:03:24 crc kubenswrapper[4781]: I1208 21:03:24.423947 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7dc689b5b9-6r5rd_d6925de4-2c7d-43cb-b2a9-1ec66f56b007/proxy-server/0.log" Dec 08 21:03:24 crc kubenswrapper[4781]: I1208 21:03:24.517314 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7dc689b5b9-6r5rd_d6925de4-2c7d-43cb-b2a9-1ec66f56b007/proxy-httpd/0.log" Dec 08 21:03:24 crc kubenswrapper[4781]: I1208 21:03:24.571097 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cdbxl_ec594735-a472-4c13-b98b-453a80fceb1d/swift-ring-rebalance/0.log" Dec 08 21:03:24 crc kubenswrapper[4781]: I1208 
21:03:24.833776 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/account-auditor/0.log" Dec 08 21:03:24 crc kubenswrapper[4781]: I1208 21:03:24.885624 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/account-reaper/0.log" Dec 08 21:03:24 crc kubenswrapper[4781]: I1208 21:03:24.977856 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/account-replicator/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.066363 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/account-server/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.086750 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/container-auditor/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.089541 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/container-replicator/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.208462 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/container-server/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.258526 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/container-updater/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.293441 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-expirer/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.332169 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-auditor/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.449649 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-replicator/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.454154 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-server/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.467651 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-updater/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.530378 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/rsync/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.663747 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/swift-recon-cron/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.752039 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf_9c03b281-e533-4108-9eff-0930b52141ca/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.924387 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e2b97db1-1e2a-45e9-b959-fde154131c3b/tempest-tests-tempest-tests-runner/0.log" Dec 08 21:03:25 crc kubenswrapper[4781]: I1208 21:03:25.930334 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_71062a6c-6abb-416f-9c98-a98fb20ad2ad/test-operator-logs-container/0.log" Dec 08 21:03:26 crc kubenswrapper[4781]: I1208 
21:03:26.130150 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7_e9261e8f-a212-4633-bc8d-06c952d3dc9f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:03:33 crc kubenswrapper[4781]: I1208 21:03:33.946040 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3be01047-1cc6-4ed4-9d41-68b1f67f7a11/memcached/0.log" Dec 08 21:03:50 crc kubenswrapper[4781]: I1208 21:03:50.585875 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/util/0.log" Dec 08 21:03:50 crc kubenswrapper[4781]: I1208 21:03:50.939228 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/util/0.log" Dec 08 21:03:50 crc kubenswrapper[4781]: I1208 21:03:50.948834 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/pull/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.011853 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/pull/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.192609 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/util/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.350660 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/pull/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.357154 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/extract/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.532404 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-spkzc_33445aa2-3e5b-4b50-ba7a-0f86d08dd64d/kube-rbac-proxy/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.569485 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-tk48g_17686434-377d-4a2f-b25e-e0074d2e06c6/kube-rbac-proxy/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.610975 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-spkzc_33445aa2-3e5b-4b50-ba7a-0f86d08dd64d/manager/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.860762 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-tk48g_17686434-377d-4a2f-b25e-e0074d2e06c6/manager/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.928093 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-cg5z2_c1ed3c21-a6cb-43f6-a018-8bead69a5439/kube-rbac-proxy/0.log" Dec 08 21:03:51 crc kubenswrapper[4781]: I1208 21:03:51.983047 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-cg5z2_c1ed3c21-a6cb-43f6-a018-8bead69a5439/manager/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 
21:03:52.039276 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-64kcb_b44e4d42-05a5-42e2-8a45-5d5506fbbb23/kube-rbac-proxy/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.172607 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-64kcb_b44e4d42-05a5-42e2-8a45-5d5506fbbb23/manager/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.214521 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-7dhp5_5e629e44-f8b6-410a-baa9-b076e609686c/kube-rbac-proxy/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.239572 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-7dhp5_5e629e44-f8b6-410a-baa9-b076e609686c/manager/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.366978 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-bkrvn_76fef842-95b4-47f1-9c34-a4edc70a3cbf/kube-rbac-proxy/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.424034 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-bkrvn_76fef842-95b4-47f1-9c34-a4edc70a3cbf/manager/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.548890 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5mbrr_47e01596-c50b-44f5-82fb-1b6c7a005d10/kube-rbac-proxy/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.787779 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-cv24b_97f1308f-60a9-4e7f-b029-8bc13246ba9e/manager/0.log" Dec 08 
21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.795762 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-cv24b_97f1308f-60a9-4e7f-b029-8bc13246ba9e/kube-rbac-proxy/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.903444 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5mbrr_47e01596-c50b-44f5-82fb-1b6c7a005d10/manager/0.log" Dec 08 21:03:52 crc kubenswrapper[4781]: I1208 21:03:52.968058 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fjhrn_344dd244-42be-4538-92c0-ab4be8f8a093/kube-rbac-proxy/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.105070 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fjhrn_344dd244-42be-4538-92c0-ab4be8f8a093/manager/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.152159 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-swkcz_51e5598f-4979-4f4d-a947-323c19dd3102/kube-rbac-proxy/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.182443 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-swkcz_51e5598f-4979-4f4d-a947-323c19dd3102/manager/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.333275 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-m9tns_2c1ab608-2b41-451a-b6b9-2cf867ab289b/kube-rbac-proxy/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.399808 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-m9tns_2c1ab608-2b41-451a-b6b9-2cf867ab289b/manager/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.495783 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zkslx_3b159a78-5c17-430a-ac90-f1d4e7fac757/kube-rbac-proxy/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.558161 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zkslx_3b159a78-5c17-430a-ac90-f1d4e7fac757/manager/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.626387 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p7f7t_e00e8983-d123-42e8-a4ef-2a2bbda78cde/kube-rbac-proxy/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.769550 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p7f7t_e00e8983-d123-42e8-a4ef-2a2bbda78cde/manager/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.862442 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-fql79_78753ed6-755b-4e63-8026-50722a9637a9/kube-rbac-proxy/0.log" Dec 08 21:03:53 crc kubenswrapper[4781]: I1208 21:03:53.897151 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-fql79_78753ed6-755b-4e63-8026-50722a9637a9/manager/0.log" Dec 08 21:03:54 crc kubenswrapper[4781]: I1208 21:03:54.020100 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-744f8cb766hj2j5_0bafa66c-7cf8-40eb-ae15-a4365fbe3176/kube-rbac-proxy/0.log" Dec 08 21:03:54 crc kubenswrapper[4781]: I1208 
21:03:54.059864 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-744f8cb766hj2j5_0bafa66c-7cf8-40eb-ae15-a4365fbe3176/manager/0.log" Dec 08 21:03:54 crc kubenswrapper[4781]: I1208 21:03:54.626170 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5f48db4cb9-gl7xk_e3c328d0-5fc0-4900-b90b-0b89bf486395/operator/0.log" Dec 08 21:03:54 crc kubenswrapper[4781]: I1208 21:03:54.684803 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-924xv_fa455afd-33f9-4f97-9eb8-838444176453/registry-server/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.022576 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-dnh9j_b5739409-2e10-4acf-8088-99608fc2f489/kube-rbac-proxy/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.067012 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rdrrr_b86b0a68-3152-46ac-8bde-3bfd32c6fbf2/kube-rbac-proxy/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.152604 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-dnh9j_b5739409-2e10-4acf-8088-99608fc2f489/manager/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.444018 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rdrrr_b86b0a68-3152-46ac-8bde-3bfd32c6fbf2/manager/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.445584 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-p26p9_60bdaea6-28dd-4dac-b1b8-a046ea0c90e0/operator/0.log" Dec 08 21:03:55 crc 
kubenswrapper[4781]: I1208 21:03:55.604792 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-24k5k_9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e/kube-rbac-proxy/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.666805 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-24k5k_9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e/manager/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.730344 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b47684954-hcdhf_0f03e65d-f0b8-4cfa-90bd-4a70de607c2d/manager/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.838155 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-j5vws_7f697324-2ce1-4c81-89c8-9cc53bac7062/kube-rbac-proxy/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.931174 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-j5vws_7f697324-2ce1-4c81-89c8-9cc53bac7062/manager/0.log" Dec 08 21:03:55 crc kubenswrapper[4781]: I1208 21:03:55.954588 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-2b4gp_908c60a3-663f-4a4a-9223-fbcff50de2b9/kube-rbac-proxy/0.log" Dec 08 21:03:56 crc kubenswrapper[4781]: I1208 21:03:56.010050 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-2b4gp_908c60a3-663f-4a4a-9223-fbcff50de2b9/manager/0.log" Dec 08 21:03:56 crc kubenswrapper[4781]: I1208 21:03:56.146548 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-fcn46_db28fdab-dab1-4d5c-9447-c895523b0985/kube-rbac-proxy/0.log" Dec 08 21:03:56 crc kubenswrapper[4781]: I1208 21:03:56.166675 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-fcn46_db28fdab-dab1-4d5c-9447-c895523b0985/manager/0.log" Dec 08 21:04:16 crc kubenswrapper[4781]: I1208 21:04:16.221855 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-29wgh_a9fba569-4fe5-44f1-905c-c0e844f64fca/control-plane-machine-set-operator/0.log" Dec 08 21:04:16 crc kubenswrapper[4781]: I1208 21:04:16.442141 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lx5mt_513ea4da-405c-4176-adbd-c8e5f68c631c/kube-rbac-proxy/0.log" Dec 08 21:04:16 crc kubenswrapper[4781]: I1208 21:04:16.475722 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lx5mt_513ea4da-405c-4176-adbd-c8e5f68c631c/machine-api-operator/0.log" Dec 08 21:04:28 crc kubenswrapper[4781]: I1208 21:04:28.521242 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nfw2s_b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d/cert-manager-controller/0.log" Dec 08 21:04:28 crc kubenswrapper[4781]: I1208 21:04:28.634462 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-t8kb7_579553a7-31d3-4b32-98fd-03e631e208d4/cert-manager-cainjector/0.log" Dec 08 21:04:28 crc kubenswrapper[4781]: I1208 21:04:28.699523 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-cpdhc_a8d7f2da-3db7-4daf-afd5-4d3984932e2d/cert-manager-webhook/0.log" Dec 08 21:04:41 crc kubenswrapper[4781]: I1208 21:04:41.861917 4781 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-bsh4c_26857532-cde7-49e1-924f-eda2b362b6b7/nmstate-console-plugin/0.log" Dec 08 21:04:42 crc kubenswrapper[4781]: I1208 21:04:42.019849 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-777jl_74d0c70c-fc95-4b78-855f-96eb01f08c07/nmstate-handler/0.log" Dec 08 21:04:42 crc kubenswrapper[4781]: I1208 21:04:42.093753 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qjn2k_ff52b8c4-e459-42ed-96a1-dead7cdcd8b9/nmstate-metrics/0.log" Dec 08 21:04:42 crc kubenswrapper[4781]: I1208 21:04:42.100563 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qjn2k_ff52b8c4-e459-42ed-96a1-dead7cdcd8b9/kube-rbac-proxy/0.log" Dec 08 21:04:42 crc kubenswrapper[4781]: I1208 21:04:42.271152 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-66p75_0e3edf9d-cddf-4b23-bda2-930fb5cbaf27/nmstate-operator/0.log" Dec 08 21:04:42 crc kubenswrapper[4781]: I1208 21:04:42.312567 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-2m6pl_f547b46e-1a15-4d7c-a3c6-0167927eb75c/nmstate-webhook/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.108384 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-k4scp_ee2c128a-e154-4685-957c-1cd19b86f113/kube-rbac-proxy/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.222517 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-k4scp_ee2c128a-e154-4685-957c-1cd19b86f113/controller/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.314831 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-frr-files/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.467106 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-metrics/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.488354 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-frr-files/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.516265 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-reloader/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.538944 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-reloader/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.707658 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-reloader/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.729864 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-metrics/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.748735 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-frr-files/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.765797 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-metrics/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.891698 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-frr-files/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.910547 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-reloader/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.923838 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/controller/0.log" Dec 08 21:04:57 crc kubenswrapper[4781]: I1208 21:04:57.949980 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-metrics/0.log" Dec 08 21:04:58 crc kubenswrapper[4781]: I1208 21:04:58.140087 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/frr-metrics/0.log" Dec 08 21:04:58 crc kubenswrapper[4781]: I1208 21:04:58.167340 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/kube-rbac-proxy/0.log" Dec 08 21:04:58 crc kubenswrapper[4781]: I1208 21:04:58.172194 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/kube-rbac-proxy-frr/0.log" Dec 08 21:04:58 crc kubenswrapper[4781]: I1208 21:04:58.315751 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/reloader/0.log" Dec 08 21:04:58 crc kubenswrapper[4781]: I1208 21:04:58.412023 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-lwsps_17dc730c-3861-434c-aeed-de6287a1c55b/frr-k8s-webhook-server/0.log" Dec 08 21:04:58 crc kubenswrapper[4781]: I1208 21:04:58.609069 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79f7dffd6f-8vls5_dd0edabf-1168-40aa-b197-c87637432272/manager/0.log" Dec 08 21:04:58 crc kubenswrapper[4781]: I1208 21:04:58.800662 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-dff497d76-fmghx_f195b37d-be68-434b-8295-23a8208108b8/webhook-server/0.log" Dec 08 21:04:58 crc kubenswrapper[4781]: I1208 21:04:58.885257 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b4s7x_95f89c56-40b6-4d4b-8060-154674b55a18/kube-rbac-proxy/0.log" Dec 08 21:04:59 crc kubenswrapper[4781]: I1208 21:04:59.412343 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b4s7x_95f89c56-40b6-4d4b-8060-154674b55a18/speaker/0.log" Dec 08 21:04:59 crc kubenswrapper[4781]: I1208 21:04:59.478292 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/frr/0.log" Dec 08 21:04:59 crc kubenswrapper[4781]: I1208 21:04:59.948539 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 21:04:59 crc kubenswrapper[4781]: I1208 21:04:59.948637 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.164987 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/util/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.281035 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/util/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.316280 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/pull/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.333682 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/pull/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.466537 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/util/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.491973 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/pull/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.500497 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/extract/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.651134 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/util/0.log" Dec 08 
21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.858098 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/pull/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.858131 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/pull/0.log" Dec 08 21:05:11 crc kubenswrapper[4781]: I1208 21:05:11.872628 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/util/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.020565 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/util/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.032519 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/extract/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.058723 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/pull/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.332927 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-utilities/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.527217 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-utilities/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.598100 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-content/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.602963 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-content/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.772264 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-utilities/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.789321 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-content/0.log" Dec 08 21:05:12 crc kubenswrapper[4781]: I1208 21:05:12.946029 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-utilities/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.185667 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-utilities/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.203411 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-content/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.223481 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-content/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.312561 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/registry-server/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.387058 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-utilities/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.393442 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-content/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.631531 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-w8dwv_81d63467-e009-4c3d-8391-9f034f2da751/marketplace-operator/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.748483 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-utilities/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.911680 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/registry-server/0.log" Dec 08 21:05:13 crc kubenswrapper[4781]: I1208 21:05:13.981500 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-utilities/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.014841 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-content/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.024413 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-content/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.161233 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-utilities/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.234721 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-content/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.356638 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/registry-server/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.412620 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-utilities/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.557470 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-utilities/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.558500 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-content/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.569714 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-content/0.log" 
Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.745216 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-utilities/0.log" Dec 08 21:05:14 crc kubenswrapper[4781]: I1208 21:05:14.751590 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-content/0.log" Dec 08 21:05:15 crc kubenswrapper[4781]: I1208 21:05:15.314266 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/registry-server/0.log" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.307871 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fk9jq"] Dec 08 21:05:26 crc kubenswrapper[4781]: E1208 21:05:26.308675 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6555b8b-f509-4d68-839f-76927f356027" containerName="container-00" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.308687 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6555b8b-f509-4d68-839f-76927f356027" containerName="container-00" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.308905 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6555b8b-f509-4d68-839f-76927f356027" containerName="container-00" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.310553 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.324313 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fk9jq"] Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.384550 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-utilities\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.384618 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-catalog-content\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.384711 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5nlz\" (UniqueName: \"kubernetes.io/projected/10094b26-5b22-4c15-a48d-3dd357eb73d3-kube-api-access-p5nlz\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.486459 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-utilities\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.486809 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-catalog-content\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.486942 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5nlz\" (UniqueName: \"kubernetes.io/projected/10094b26-5b22-4c15-a48d-3dd357eb73d3-kube-api-access-p5nlz\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.487034 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-utilities\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.487326 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-catalog-content\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.519206 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5nlz\" (UniqueName: \"kubernetes.io/projected/10094b26-5b22-4c15-a48d-3dd357eb73d3-kube-api-access-p5nlz\") pod \"redhat-operators-fk9jq\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:26 crc kubenswrapper[4781]: I1208 21:05:26.634216 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:27 crc kubenswrapper[4781]: I1208 21:05:27.119142 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fk9jq"] Dec 08 21:05:27 crc kubenswrapper[4781]: I1208 21:05:27.914224 4781 generic.go:334] "Generic (PLEG): container finished" podID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerID="d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c" exitCode=0 Dec 08 21:05:27 crc kubenswrapper[4781]: I1208 21:05:27.914268 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9jq" event={"ID":"10094b26-5b22-4c15-a48d-3dd357eb73d3","Type":"ContainerDied","Data":"d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c"} Dec 08 21:05:27 crc kubenswrapper[4781]: I1208 21:05:27.914529 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9jq" event={"ID":"10094b26-5b22-4c15-a48d-3dd357eb73d3","Type":"ContainerStarted","Data":"ff22e55349e7edd99063c16f825fb36bceb5aaa12a21a72aa08d04a69544940e"} Dec 08 21:05:27 crc kubenswrapper[4781]: I1208 21:05:27.917431 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 21:05:28 crc kubenswrapper[4781]: I1208 21:05:28.926803 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9jq" event={"ID":"10094b26-5b22-4c15-a48d-3dd357eb73d3","Type":"ContainerStarted","Data":"afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93"} Dec 08 21:05:29 crc kubenswrapper[4781]: I1208 21:05:29.947834 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 21:05:29 crc 
kubenswrapper[4781]: I1208 21:05:29.947910 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 21:05:31 crc kubenswrapper[4781]: I1208 21:05:31.956212 4781 generic.go:334] "Generic (PLEG): container finished" podID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerID="afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93" exitCode=0 Dec 08 21:05:31 crc kubenswrapper[4781]: I1208 21:05:31.956426 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9jq" event={"ID":"10094b26-5b22-4c15-a48d-3dd357eb73d3","Type":"ContainerDied","Data":"afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93"} Dec 08 21:05:32 crc kubenswrapper[4781]: E1208 21:05:32.105509 4781 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.159:51354->38.102.83.159:36535: write tcp 38.102.83.159:51354->38.102.83.159:36535: write: broken pipe Dec 08 21:05:32 crc kubenswrapper[4781]: I1208 21:05:32.969699 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9jq" event={"ID":"10094b26-5b22-4c15-a48d-3dd357eb73d3","Type":"ContainerStarted","Data":"9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76"} Dec 08 21:05:32 crc kubenswrapper[4781]: I1208 21:05:32.993343 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fk9jq" podStartSLOduration=2.21010851 podStartE2EDuration="6.993327708s" podCreationTimestamp="2025-12-08 21:05:26 +0000 UTC" firstStartedPulling="2025-12-08 21:05:27.917133509 +0000 UTC m=+3644.068416886" lastFinishedPulling="2025-12-08 21:05:32.700352707 +0000 UTC m=+3648.851636084" 
observedRunningTime="2025-12-08 21:05:32.989485298 +0000 UTC m=+3649.140768685" watchObservedRunningTime="2025-12-08 21:05:32.993327708 +0000 UTC m=+3649.144611085" Dec 08 21:05:36 crc kubenswrapper[4781]: I1208 21:05:36.634562 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:36 crc kubenswrapper[4781]: I1208 21:05:36.635035 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:37 crc kubenswrapper[4781]: I1208 21:05:37.700607 4781 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fk9jq" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="registry-server" probeResult="failure" output=< Dec 08 21:05:37 crc kubenswrapper[4781]: timeout: failed to connect service ":50051" within 1s Dec 08 21:05:37 crc kubenswrapper[4781]: > Dec 08 21:05:46 crc kubenswrapper[4781]: I1208 21:05:46.689837 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:46 crc kubenswrapper[4781]: I1208 21:05:46.748000 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:46 crc kubenswrapper[4781]: I1208 21:05:46.930706 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fk9jq"] Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.252361 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fk9jq" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="registry-server" containerID="cri-o://9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76" gracePeriod=2 Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.774443 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.822653 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-utilities\") pod \"10094b26-5b22-4c15-a48d-3dd357eb73d3\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.822751 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-catalog-content\") pod \"10094b26-5b22-4c15-a48d-3dd357eb73d3\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.822821 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5nlz\" (UniqueName: \"kubernetes.io/projected/10094b26-5b22-4c15-a48d-3dd357eb73d3-kube-api-access-p5nlz\") pod \"10094b26-5b22-4c15-a48d-3dd357eb73d3\" (UID: \"10094b26-5b22-4c15-a48d-3dd357eb73d3\") " Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.823809 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-utilities" (OuterVolumeSpecName: "utilities") pod "10094b26-5b22-4c15-a48d-3dd357eb73d3" (UID: "10094b26-5b22-4c15-a48d-3dd357eb73d3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.828170 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10094b26-5b22-4c15-a48d-3dd357eb73d3-kube-api-access-p5nlz" (OuterVolumeSpecName: "kube-api-access-p5nlz") pod "10094b26-5b22-4c15-a48d-3dd357eb73d3" (UID: "10094b26-5b22-4c15-a48d-3dd357eb73d3"). InnerVolumeSpecName "kube-api-access-p5nlz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.926554 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.926595 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5nlz\" (UniqueName: \"kubernetes.io/projected/10094b26-5b22-4c15-a48d-3dd357eb73d3-kube-api-access-p5nlz\") on node \"crc\" DevicePath \"\"" Dec 08 21:05:48 crc kubenswrapper[4781]: I1208 21:05:48.944577 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10094b26-5b22-4c15-a48d-3dd357eb73d3" (UID: "10094b26-5b22-4c15-a48d-3dd357eb73d3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.029139 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10094b26-5b22-4c15-a48d-3dd357eb73d3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.264821 4781 generic.go:334] "Generic (PLEG): container finished" podID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerID="9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76" exitCode=0 Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.264868 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9jq" event={"ID":"10094b26-5b22-4c15-a48d-3dd357eb73d3","Type":"ContainerDied","Data":"9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76"} Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.264896 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-fk9jq" event={"ID":"10094b26-5b22-4c15-a48d-3dd357eb73d3","Type":"ContainerDied","Data":"ff22e55349e7edd99063c16f825fb36bceb5aaa12a21a72aa08d04a69544940e"} Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.264947 4781 scope.go:117] "RemoveContainer" containerID="9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.264970 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fk9jq" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.313547 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fk9jq"] Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.315811 4781 scope.go:117] "RemoveContainer" containerID="afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.324566 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fk9jq"] Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.346409 4781 scope.go:117] "RemoveContainer" containerID="d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.385291 4781 scope.go:117] "RemoveContainer" containerID="9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76" Dec 08 21:05:49 crc kubenswrapper[4781]: E1208 21:05:49.386102 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76\": container with ID starting with 9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76 not found: ID does not exist" containerID="9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.386170 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76"} err="failed to get container status \"9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76\": rpc error: code = NotFound desc = could not find container \"9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76\": container with ID starting with 9ecfaacd8dddaa86ebe26550519684513bcf5e231ec13c76e8142c6634ed8b76 not found: ID does not exist" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.386203 4781 scope.go:117] "RemoveContainer" containerID="afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93" Dec 08 21:05:49 crc kubenswrapper[4781]: E1208 21:05:49.387000 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93\": container with ID starting with afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93 not found: ID does not exist" containerID="afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.387057 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93"} err="failed to get container status \"afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93\": rpc error: code = NotFound desc = could not find container \"afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93\": container with ID starting with afe1bc649f735bbb11c5d3349defca1f04a19df6e389c3562f136ae333e5db93 not found: ID does not exist" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.387087 4781 scope.go:117] "RemoveContainer" containerID="d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c" Dec 08 21:05:49 crc kubenswrapper[4781]: E1208 
21:05:49.387681 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c\": container with ID starting with d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c not found: ID does not exist" containerID="d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c" Dec 08 21:05:49 crc kubenswrapper[4781]: I1208 21:05:49.387726 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c"} err="failed to get container status \"d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c\": rpc error: code = NotFound desc = could not find container \"d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c\": container with ID starting with d6b024fb2e582deb936a5045c8336b7d06a24f66538a682e8dbb447af2e12e0c not found: ID does not exist" Dec 08 21:05:50 crc kubenswrapper[4781]: I1208 21:05:50.135709 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" path="/var/lib/kubelet/pods/10094b26-5b22-4c15-a48d-3dd357eb73d3/volumes" Dec 08 21:05:59 crc kubenswrapper[4781]: I1208 21:05:59.948009 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 21:05:59 crc kubenswrapper[4781]: I1208 21:05:59.948502 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 08 21:05:59 crc kubenswrapper[4781]: I1208 21:05:59.948573 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 21:05:59 crc kubenswrapper[4781]: I1208 21:05:59.952551 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27ff1a6c014cb26a25650d45b4f247e2b1d66fd533143c98c5d1073d95951d01"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 21:05:59 crc kubenswrapper[4781]: I1208 21:05:59.952695 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://27ff1a6c014cb26a25650d45b4f247e2b1d66fd533143c98c5d1073d95951d01" gracePeriod=600 Dec 08 21:06:00 crc kubenswrapper[4781]: I1208 21:06:00.540659 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="27ff1a6c014cb26a25650d45b4f247e2b1d66fd533143c98c5d1073d95951d01" exitCode=0 Dec 08 21:06:00 crc kubenswrapper[4781]: I1208 21:06:00.540697 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"27ff1a6c014cb26a25650d45b4f247e2b1d66fd533143c98c5d1073d95951d01"} Dec 08 21:06:00 crc kubenswrapper[4781]: I1208 21:06:00.541072 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619"} Dec 08 21:06:00 crc 
kubenswrapper[4781]: I1208 21:06:00.541102 4781 scope.go:117] "RemoveContainer" containerID="be3438256d00c983ad1e1c1c72c8b29369cc3937d20bdf95bfd4b62482fb1469" Dec 08 21:06:54 crc kubenswrapper[4781]: I1208 21:06:54.135964 4781 generic.go:334] "Generic (PLEG): container finished" podID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerID="f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb" exitCode=0 Dec 08 21:06:54 crc kubenswrapper[4781]: I1208 21:06:54.150570 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2k9gc/must-gather-8drg4" event={"ID":"febcd94c-89fc-40bc-8e6c-28540f33048d","Type":"ContainerDied","Data":"f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb"} Dec 08 21:06:54 crc kubenswrapper[4781]: I1208 21:06:54.151694 4781 scope.go:117] "RemoveContainer" containerID="f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb" Dec 08 21:06:54 crc kubenswrapper[4781]: I1208 21:06:54.858518 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2k9gc_must-gather-8drg4_febcd94c-89fc-40bc-8e6c-28540f33048d/gather/0.log" Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.216760 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2k9gc/must-gather-8drg4"] Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.217660 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2k9gc/must-gather-8drg4" podUID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerName="copy" containerID="cri-o://146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c" gracePeriod=2 Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.226225 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2k9gc/must-gather-8drg4"] Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.771678 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-2k9gc_must-gather-8drg4_febcd94c-89fc-40bc-8e6c-28540f33048d/copy/0.log" Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.772504 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.862136 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/febcd94c-89fc-40bc-8e6c-28540f33048d-must-gather-output\") pod \"febcd94c-89fc-40bc-8e6c-28540f33048d\" (UID: \"febcd94c-89fc-40bc-8e6c-28540f33048d\") " Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.862295 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc7q5\" (UniqueName: \"kubernetes.io/projected/febcd94c-89fc-40bc-8e6c-28540f33048d-kube-api-access-rc7q5\") pod \"febcd94c-89fc-40bc-8e6c-28540f33048d\" (UID: \"febcd94c-89fc-40bc-8e6c-28540f33048d\") " Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.957773 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/febcd94c-89fc-40bc-8e6c-28540f33048d-kube-api-access-rc7q5" (OuterVolumeSpecName: "kube-api-access-rc7q5") pod "febcd94c-89fc-40bc-8e6c-28540f33048d" (UID: "febcd94c-89fc-40bc-8e6c-28540f33048d"). InnerVolumeSpecName "kube-api-access-rc7q5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:07:03 crc kubenswrapper[4781]: I1208 21:07:03.964864 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc7q5\" (UniqueName: \"kubernetes.io/projected/febcd94c-89fc-40bc-8e6c-28540f33048d-kube-api-access-rc7q5\") on node \"crc\" DevicePath \"\"" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.011727 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febcd94c-89fc-40bc-8e6c-28540f33048d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "febcd94c-89fc-40bc-8e6c-28540f33048d" (UID: "febcd94c-89fc-40bc-8e6c-28540f33048d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.066292 4781 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/febcd94c-89fc-40bc-8e6c-28540f33048d-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.146512 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="febcd94c-89fc-40bc-8e6c-28540f33048d" path="/var/lib/kubelet/pods/febcd94c-89fc-40bc-8e6c-28540f33048d/volumes" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.258604 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2k9gc_must-gather-8drg4_febcd94c-89fc-40bc-8e6c-28540f33048d/copy/0.log" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.258986 4781 generic.go:334] "Generic (PLEG): container finished" podID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerID="146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c" exitCode=143 Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.259035 4781 scope.go:117] "RemoveContainer" containerID="146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c" Dec 08 21:07:04 crc 
kubenswrapper[4781]: I1208 21:07:04.259040 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2k9gc/must-gather-8drg4" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.285630 4781 scope.go:117] "RemoveContainer" containerID="f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.375462 4781 scope.go:117] "RemoveContainer" containerID="146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c" Dec 08 21:07:04 crc kubenswrapper[4781]: E1208 21:07:04.376114 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c\": container with ID starting with 146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c not found: ID does not exist" containerID="146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.376144 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c"} err="failed to get container status \"146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c\": rpc error: code = NotFound desc = could not find container \"146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c\": container with ID starting with 146bcfa39c40b383f56c41bfa480cdb44806504aa672ef41db98fb3b105f4b4c not found: ID does not exist" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.376167 4781 scope.go:117] "RemoveContainer" containerID="f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb" Dec 08 21:07:04 crc kubenswrapper[4781]: E1208 21:07:04.376514 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb\": container with ID starting with f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb not found: ID does not exist" containerID="f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb" Dec 08 21:07:04 crc kubenswrapper[4781]: I1208 21:07:04.376534 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb"} err="failed to get container status \"f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb\": rpc error: code = NotFound desc = could not find container \"f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb\": container with ID starting with f2ed68cde998eff3c986f091181418fbdec039a30accebad5e1c3b2d5b191ecb not found: ID does not exist" Dec 08 21:08:29 crc kubenswrapper[4781]: I1208 21:08:29.947946 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 21:08:29 crc kubenswrapper[4781]: I1208 21:08:29.948527 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 21:08:59 crc kubenswrapper[4781]: I1208 21:08:59.948312 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 21:08:59 crc 
kubenswrapper[4781]: I1208 21:08:59.949258 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 21:09:29 crc kubenswrapper[4781]: I1208 21:09:29.948457 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 08 21:09:29 crc kubenswrapper[4781]: I1208 21:09:29.949329 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 08 21:09:29 crc kubenswrapper[4781]: I1208 21:09:29.949454 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" Dec 08 21:09:29 crc kubenswrapper[4781]: I1208 21:09:29.950629 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 08 21:09:29 crc kubenswrapper[4781]: I1208 21:09:29.950779 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" 
podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" gracePeriod=600 Dec 08 21:09:30 crc kubenswrapper[4781]: E1208 21:09:30.087555 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:09:31 crc kubenswrapper[4781]: I1208 21:09:31.006982 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" exitCode=0 Dec 08 21:09:31 crc kubenswrapper[4781]: I1208 21:09:31.007104 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619"} Dec 08 21:09:31 crc kubenswrapper[4781]: I1208 21:09:31.007595 4781 scope.go:117] "RemoveContainer" containerID="27ff1a6c014cb26a25650d45b4f247e2b1d66fd533143c98c5d1073d95951d01" Dec 08 21:09:31 crc kubenswrapper[4781]: I1208 21:09:31.008769 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:09:31 crc kubenswrapper[4781]: E1208 21:09:31.011547 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:09:46 crc kubenswrapper[4781]: I1208 21:09:46.126773 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:09:46 crc kubenswrapper[4781]: E1208 21:09:46.127467 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:10:00 crc kubenswrapper[4781]: I1208 21:10:00.126876 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:10:00 crc kubenswrapper[4781]: E1208 21:10:00.127882 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.472951 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-455h5/must-gather-zk4ld"] Dec 08 21:10:03 crc kubenswrapper[4781]: E1208 21:10:03.473940 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerName="copy" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.473954 4781 
state_mem.go:107] "Deleted CPUSet assignment" podUID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerName="copy" Dec 08 21:10:03 crc kubenswrapper[4781]: E1208 21:10:03.473969 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="registry-server" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.473975 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="registry-server" Dec 08 21:10:03 crc kubenswrapper[4781]: E1208 21:10:03.474034 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="extract-content" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.474040 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="extract-content" Dec 08 21:10:03 crc kubenswrapper[4781]: E1208 21:10:03.474056 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerName="gather" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.474064 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerName="gather" Dec 08 21:10:03 crc kubenswrapper[4781]: E1208 21:10:03.474082 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="extract-utilities" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.474088 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="extract-utilities" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.474277 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerName="gather" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.474289 4781 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="10094b26-5b22-4c15-a48d-3dd357eb73d3" containerName="registry-server" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.474313 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="febcd94c-89fc-40bc-8e6c-28540f33048d" containerName="copy" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.475384 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.478086 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-455h5"/"default-dockercfg-qprxk" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.478116 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-455h5"/"openshift-service-ca.crt" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.478092 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-455h5"/"kube-root-ca.crt" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.484028 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-455h5/must-gather-zk4ld"] Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.548085 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6xxf\" (UniqueName: \"kubernetes.io/projected/653dbb9a-6eca-4e85-8073-9cf00a7f1346-kube-api-access-r6xxf\") pod \"must-gather-zk4ld\" (UID: \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\") " pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.548371 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/653dbb9a-6eca-4e85-8073-9cf00a7f1346-must-gather-output\") pod \"must-gather-zk4ld\" (UID: \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\") " 
pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.650212 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6xxf\" (UniqueName: \"kubernetes.io/projected/653dbb9a-6eca-4e85-8073-9cf00a7f1346-kube-api-access-r6xxf\") pod \"must-gather-zk4ld\" (UID: \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\") " pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.650534 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/653dbb9a-6eca-4e85-8073-9cf00a7f1346-must-gather-output\") pod \"must-gather-zk4ld\" (UID: \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\") " pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.650955 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/653dbb9a-6eca-4e85-8073-9cf00a7f1346-must-gather-output\") pod \"must-gather-zk4ld\" (UID: \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\") " pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.674873 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6xxf\" (UniqueName: \"kubernetes.io/projected/653dbb9a-6eca-4e85-8073-9cf00a7f1346-kube-api-access-r6xxf\") pod \"must-gather-zk4ld\" (UID: \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\") " pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:10:03 crc kubenswrapper[4781]: I1208 21:10:03.792702 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:10:04 crc kubenswrapper[4781]: I1208 21:10:04.268641 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-455h5/must-gather-zk4ld"] Dec 08 21:10:04 crc kubenswrapper[4781]: I1208 21:10:04.394288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/must-gather-zk4ld" event={"ID":"653dbb9a-6eca-4e85-8073-9cf00a7f1346","Type":"ContainerStarted","Data":"08781bce2be15339509f554e31b184e01814cb8813921fc2df9df8008e62bbf0"} Dec 08 21:10:05 crc kubenswrapper[4781]: I1208 21:10:05.405973 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/must-gather-zk4ld" event={"ID":"653dbb9a-6eca-4e85-8073-9cf00a7f1346","Type":"ContainerStarted","Data":"41b2704a97035b881b64fa48a2d8d38152cd8893d9d99a9491416f9c51c8ffb6"} Dec 08 21:10:05 crc kubenswrapper[4781]: I1208 21:10:05.406303 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/must-gather-zk4ld" event={"ID":"653dbb9a-6eca-4e85-8073-9cf00a7f1346","Type":"ContainerStarted","Data":"411ffc3a29c1712b50a9566cb0f658db2cf68758e48a85a0e29e8480d8fbb1f6"} Dec 08 21:10:05 crc kubenswrapper[4781]: I1208 21:10:05.432351 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-455h5/must-gather-zk4ld" podStartSLOduration=2.432333404 podStartE2EDuration="2.432333404s" podCreationTimestamp="2025-12-08 21:10:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 21:10:05.425649213 +0000 UTC m=+3921.576932590" watchObservedRunningTime="2025-12-08 21:10:05.432333404 +0000 UTC m=+3921.583616781" Dec 08 21:10:08 crc kubenswrapper[4781]: I1208 21:10:08.102745 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-455h5/crc-debug-dclth"] Dec 08 21:10:08 crc kubenswrapper[4781]: 
I1208 21:10:08.104844 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:08 crc kubenswrapper[4781]: I1208 21:10:08.234054 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlffr\" (UniqueName: \"kubernetes.io/projected/96b2dd97-8174-435c-85d0-02c2d39239fd-kube-api-access-qlffr\") pod \"crc-debug-dclth\" (UID: \"96b2dd97-8174-435c-85d0-02c2d39239fd\") " pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:08 crc kubenswrapper[4781]: I1208 21:10:08.234897 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b2dd97-8174-435c-85d0-02c2d39239fd-host\") pod \"crc-debug-dclth\" (UID: \"96b2dd97-8174-435c-85d0-02c2d39239fd\") " pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:08 crc kubenswrapper[4781]: I1208 21:10:08.407800 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlffr\" (UniqueName: \"kubernetes.io/projected/96b2dd97-8174-435c-85d0-02c2d39239fd-kube-api-access-qlffr\") pod \"crc-debug-dclth\" (UID: \"96b2dd97-8174-435c-85d0-02c2d39239fd\") " pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:08 crc kubenswrapper[4781]: I1208 21:10:08.407947 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b2dd97-8174-435c-85d0-02c2d39239fd-host\") pod \"crc-debug-dclth\" (UID: \"96b2dd97-8174-435c-85d0-02c2d39239fd\") " pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:08 crc kubenswrapper[4781]: I1208 21:10:08.408074 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b2dd97-8174-435c-85d0-02c2d39239fd-host\") pod \"crc-debug-dclth\" (UID: \"96b2dd97-8174-435c-85d0-02c2d39239fd\") 
" pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:08 crc kubenswrapper[4781]: I1208 21:10:08.429008 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlffr\" (UniqueName: \"kubernetes.io/projected/96b2dd97-8174-435c-85d0-02c2d39239fd-kube-api-access-qlffr\") pod \"crc-debug-dclth\" (UID: \"96b2dd97-8174-435c-85d0-02c2d39239fd\") " pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:08 crc kubenswrapper[4781]: I1208 21:10:08.429479 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:08 crc kubenswrapper[4781]: W1208 21:10:08.467107 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96b2dd97_8174_435c_85d0_02c2d39239fd.slice/crio-41be928cd9944d0de9c8904a8c03c6f5ff67c18a6910021d1f5aa0f901cfb925 WatchSource:0}: Error finding container 41be928cd9944d0de9c8904a8c03c6f5ff67c18a6910021d1f5aa0f901cfb925: Status 404 returned error can't find the container with id 41be928cd9944d0de9c8904a8c03c6f5ff67c18a6910021d1f5aa0f901cfb925 Dec 08 21:10:09 crc kubenswrapper[4781]: I1208 21:10:09.439222 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/crc-debug-dclth" event={"ID":"96b2dd97-8174-435c-85d0-02c2d39239fd","Type":"ContainerStarted","Data":"e5f3756745fb4b475b8612b1169575c0dc68f5651ff2e5840c24683e978d473b"} Dec 08 21:10:09 crc kubenswrapper[4781]: I1208 21:10:09.439666 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/crc-debug-dclth" event={"ID":"96b2dd97-8174-435c-85d0-02c2d39239fd","Type":"ContainerStarted","Data":"41be928cd9944d0de9c8904a8c03c6f5ff67c18a6910021d1f5aa0f901cfb925"} Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.652125 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-455h5/crc-debug-dclth" 
podStartSLOduration=3.652093398 podStartE2EDuration="3.652093398s" podCreationTimestamp="2025-12-08 21:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 21:10:09.464877821 +0000 UTC m=+3925.616161198" watchObservedRunningTime="2025-12-08 21:10:11.652093398 +0000 UTC m=+3927.803376785" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.659886 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4ts28"] Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.661974 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.673566 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ts28"] Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.793806 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-catalog-content\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.793863 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-utilities\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.793903 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-564t4\" (UniqueName: 
\"kubernetes.io/projected/c1a78d36-3704-481d-aff1-7c14854705f9-kube-api-access-564t4\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.895708 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-564t4\" (UniqueName: \"kubernetes.io/projected/c1a78d36-3704-481d-aff1-7c14854705f9-kube-api-access-564t4\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.895872 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-catalog-content\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.895904 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-utilities\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.896364 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-catalog-content\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.896434 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-utilities\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.917114 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-564t4\" (UniqueName: \"kubernetes.io/projected/c1a78d36-3704-481d-aff1-7c14854705f9-kube-api-access-564t4\") pod \"redhat-marketplace-4ts28\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:11 crc kubenswrapper[4781]: I1208 21:10:11.985154 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:12 crc kubenswrapper[4781]: I1208 21:10:12.534571 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ts28"] Dec 08 21:10:13 crc kubenswrapper[4781]: I1208 21:10:13.526446 4781 generic.go:334] "Generic (PLEG): container finished" podID="c1a78d36-3704-481d-aff1-7c14854705f9" containerID="de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5" exitCode=0 Dec 08 21:10:13 crc kubenswrapper[4781]: I1208 21:10:13.526548 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ts28" event={"ID":"c1a78d36-3704-481d-aff1-7c14854705f9","Type":"ContainerDied","Data":"de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5"} Dec 08 21:10:13 crc kubenswrapper[4781]: I1208 21:10:13.526713 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ts28" event={"ID":"c1a78d36-3704-481d-aff1-7c14854705f9","Type":"ContainerStarted","Data":"697bbce766ac17668ecb3eb78e8bcaa1389e893140c208d44dca1c04fa4d6690"} Dec 08 21:10:14 crc kubenswrapper[4781]: I1208 21:10:14.135656 4781 scope.go:117] "RemoveContainer" 
containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:10:14 crc kubenswrapper[4781]: E1208 21:10:14.135984 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:10:14 crc kubenswrapper[4781]: I1208 21:10:14.569997 4781 generic.go:334] "Generic (PLEG): container finished" podID="c1a78d36-3704-481d-aff1-7c14854705f9" containerID="a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac" exitCode=0 Dec 08 21:10:14 crc kubenswrapper[4781]: I1208 21:10:14.571138 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ts28" event={"ID":"c1a78d36-3704-481d-aff1-7c14854705f9","Type":"ContainerDied","Data":"a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac"} Dec 08 21:10:15 crc kubenswrapper[4781]: I1208 21:10:15.582354 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ts28" event={"ID":"c1a78d36-3704-481d-aff1-7c14854705f9","Type":"ContainerStarted","Data":"bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a"} Dec 08 21:10:15 crc kubenswrapper[4781]: I1208 21:10:15.608961 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4ts28" podStartSLOduration=3.078742005 podStartE2EDuration="4.60891005s" podCreationTimestamp="2025-12-08 21:10:11 +0000 UTC" firstStartedPulling="2025-12-08 21:10:13.534092726 +0000 UTC m=+3929.685376103" lastFinishedPulling="2025-12-08 21:10:15.064260771 +0000 UTC m=+3931.215544148" observedRunningTime="2025-12-08 21:10:15.597503973 +0000 UTC 
m=+3931.748787360" watchObservedRunningTime="2025-12-08 21:10:15.60891005 +0000 UTC m=+3931.760193427" Dec 08 21:10:21 crc kubenswrapper[4781]: I1208 21:10:21.986714 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:21 crc kubenswrapper[4781]: I1208 21:10:21.987311 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:22 crc kubenswrapper[4781]: I1208 21:10:22.030408 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:22 crc kubenswrapper[4781]: I1208 21:10:22.799009 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:22 crc kubenswrapper[4781]: I1208 21:10:22.855774 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ts28"] Dec 08 21:10:24 crc kubenswrapper[4781]: I1208 21:10:24.758283 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4ts28" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" containerName="registry-server" containerID="cri-o://bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a" gracePeriod=2 Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.327558 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.452522 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-564t4\" (UniqueName: \"kubernetes.io/projected/c1a78d36-3704-481d-aff1-7c14854705f9-kube-api-access-564t4\") pod \"c1a78d36-3704-481d-aff1-7c14854705f9\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.453181 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-catalog-content\") pod \"c1a78d36-3704-481d-aff1-7c14854705f9\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.453440 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-utilities\") pod \"c1a78d36-3704-481d-aff1-7c14854705f9\" (UID: \"c1a78d36-3704-481d-aff1-7c14854705f9\") " Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.454906 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-utilities" (OuterVolumeSpecName: "utilities") pod "c1a78d36-3704-481d-aff1-7c14854705f9" (UID: "c1a78d36-3704-481d-aff1-7c14854705f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.459532 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a78d36-3704-481d-aff1-7c14854705f9-kube-api-access-564t4" (OuterVolumeSpecName: "kube-api-access-564t4") pod "c1a78d36-3704-481d-aff1-7c14854705f9" (UID: "c1a78d36-3704-481d-aff1-7c14854705f9"). InnerVolumeSpecName "kube-api-access-564t4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.480210 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1a78d36-3704-481d-aff1-7c14854705f9" (UID: "c1a78d36-3704-481d-aff1-7c14854705f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.555749 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-564t4\" (UniqueName: \"kubernetes.io/projected/c1a78d36-3704-481d-aff1-7c14854705f9-kube-api-access-564t4\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.556018 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.556111 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1a78d36-3704-481d-aff1-7c14854705f9-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.769571 4781 generic.go:334] "Generic (PLEG): container finished" podID="c1a78d36-3704-481d-aff1-7c14854705f9" containerID="bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a" exitCode=0 Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.769777 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ts28" event={"ID":"c1a78d36-3704-481d-aff1-7c14854705f9","Type":"ContainerDied","Data":"bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a"} Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.770682 4781 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-4ts28" event={"ID":"c1a78d36-3704-481d-aff1-7c14854705f9","Type":"ContainerDied","Data":"697bbce766ac17668ecb3eb78e8bcaa1389e893140c208d44dca1c04fa4d6690"} Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.770767 4781 scope.go:117] "RemoveContainer" containerID="bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.769851 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ts28" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.794130 4781 scope.go:117] "RemoveContainer" containerID="a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.814129 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ts28"] Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.824473 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ts28"] Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.833220 4781 scope.go:117] "RemoveContainer" containerID="de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.863831 4781 scope.go:117] "RemoveContainer" containerID="bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a" Dec 08 21:10:25 crc kubenswrapper[4781]: E1208 21:10:25.865003 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a\": container with ID starting with bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a not found: ID does not exist" containerID="bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.865050 4781 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a"} err="failed to get container status \"bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a\": rpc error: code = NotFound desc = could not find container \"bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a\": container with ID starting with bb750adb19f83315dfd697e9a47bdf7db5f449f6b80a681d0b1469d65b1aba3a not found: ID does not exist" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.865073 4781 scope.go:117] "RemoveContainer" containerID="a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac" Dec 08 21:10:25 crc kubenswrapper[4781]: E1208 21:10:25.865689 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac\": container with ID starting with a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac not found: ID does not exist" containerID="a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.865730 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac"} err="failed to get container status \"a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac\": rpc error: code = NotFound desc = could not find container \"a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac\": container with ID starting with a2249c4808b6f899fa26b9e959273772b8cbaffed17df1a2ae79f858196599ac not found: ID does not exist" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.865748 4781 scope.go:117] "RemoveContainer" containerID="de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5" Dec 08 21:10:25 crc kubenswrapper[4781]: E1208 
21:10:25.866063 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5\": container with ID starting with de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5 not found: ID does not exist" containerID="de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5" Dec 08 21:10:25 crc kubenswrapper[4781]: I1208 21:10:25.866088 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5"} err="failed to get container status \"de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5\": rpc error: code = NotFound desc = could not find container \"de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5\": container with ID starting with de0aa83686bebb5b52d3239e874e07b3c05eb797021492662bd7caa4359b4ca5 not found: ID does not exist" Dec 08 21:10:26 crc kubenswrapper[4781]: I1208 21:10:26.139223 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" path="/var/lib/kubelet/pods/c1a78d36-3704-481d-aff1-7c14854705f9/volumes" Dec 08 21:10:28 crc kubenswrapper[4781]: I1208 21:10:28.127056 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:10:28 crc kubenswrapper[4781]: E1208 21:10:28.127875 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.055489 
4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nc4bf"] Dec 08 21:10:32 crc kubenswrapper[4781]: E1208 21:10:32.056427 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" containerName="extract-content" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.056443 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" containerName="extract-content" Dec 08 21:10:32 crc kubenswrapper[4781]: E1208 21:10:32.056455 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" containerName="registry-server" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.056461 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" containerName="registry-server" Dec 08 21:10:32 crc kubenswrapper[4781]: E1208 21:10:32.056501 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" containerName="extract-utilities" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.056512 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" containerName="extract-utilities" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.056766 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a78d36-3704-481d-aff1-7c14854705f9" containerName="registry-server" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.058191 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.071115 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nc4bf"] Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.165818 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-catalog-content\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.165890 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-utilities\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.166020 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr5jz\" (UniqueName: \"kubernetes.io/projected/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-kube-api-access-wr5jz\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.268052 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-catalog-content\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.268138 4781 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-utilities\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.268280 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr5jz\" (UniqueName: \"kubernetes.io/projected/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-kube-api-access-wr5jz\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.268785 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-catalog-content\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.269768 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-utilities\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.310739 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr5jz\" (UniqueName: \"kubernetes.io/projected/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-kube-api-access-wr5jz\") pod \"community-operators-nc4bf\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.388898 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:32 crc kubenswrapper[4781]: I1208 21:10:32.970660 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nc4bf"] Dec 08 21:10:33 crc kubenswrapper[4781]: I1208 21:10:33.839505 4781 generic.go:334] "Generic (PLEG): container finished" podID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerID="355f85da88b0a9d1f3e9ea064045626ea25c0742115190839bdfab8965e3facb" exitCode=0 Dec 08 21:10:33 crc kubenswrapper[4781]: I1208 21:10:33.839602 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nc4bf" event={"ID":"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08","Type":"ContainerDied","Data":"355f85da88b0a9d1f3e9ea064045626ea25c0742115190839bdfab8965e3facb"} Dec 08 21:10:33 crc kubenswrapper[4781]: I1208 21:10:33.839793 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nc4bf" event={"ID":"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08","Type":"ContainerStarted","Data":"334c2cef89c1da14fe0a254cbbfcbb5a0a4403785fb29add0e5d8d05f3db45b1"} Dec 08 21:10:33 crc kubenswrapper[4781]: I1208 21:10:33.842901 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 08 21:10:34 crc kubenswrapper[4781]: I1208 21:10:34.893007 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nc4bf" event={"ID":"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08","Type":"ContainerStarted","Data":"c2f0e83e2db9c80d414426b38a05c999e7d1cc109275b1216b836e8b6d15efe8"} Dec 08 21:10:35 crc kubenswrapper[4781]: I1208 21:10:35.903647 4781 generic.go:334] "Generic (PLEG): container finished" podID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerID="c2f0e83e2db9c80d414426b38a05c999e7d1cc109275b1216b836e8b6d15efe8" exitCode=0 Dec 08 21:10:35 crc kubenswrapper[4781]: I1208 21:10:35.903705 4781 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-nc4bf" event={"ID":"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08","Type":"ContainerDied","Data":"c2f0e83e2db9c80d414426b38a05c999e7d1cc109275b1216b836e8b6d15efe8"} Dec 08 21:10:36 crc kubenswrapper[4781]: I1208 21:10:36.917134 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nc4bf" event={"ID":"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08","Type":"ContainerStarted","Data":"7d05f698ced240742e084fe4192aa9d7fd2c165dd3e00d30c36125beb4a86398"} Dec 08 21:10:36 crc kubenswrapper[4781]: I1208 21:10:36.938354 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nc4bf" podStartSLOduration=2.456847785 podStartE2EDuration="4.938338999s" podCreationTimestamp="2025-12-08 21:10:32 +0000 UTC" firstStartedPulling="2025-12-08 21:10:33.842654038 +0000 UTC m=+3949.993937415" lastFinishedPulling="2025-12-08 21:10:36.324145252 +0000 UTC m=+3952.475428629" observedRunningTime="2025-12-08 21:10:36.936309661 +0000 UTC m=+3953.087593038" watchObservedRunningTime="2025-12-08 21:10:36.938338999 +0000 UTC m=+3953.089622376" Dec 08 21:10:42 crc kubenswrapper[4781]: I1208 21:10:42.126517 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:10:42 crc kubenswrapper[4781]: E1208 21:10:42.127277 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:10:42 crc kubenswrapper[4781]: I1208 21:10:42.389615 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:42 crc kubenswrapper[4781]: I1208 21:10:42.389677 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:42 crc kubenswrapper[4781]: I1208 21:10:42.449396 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:43 crc kubenswrapper[4781]: I1208 21:10:43.065902 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:43 crc kubenswrapper[4781]: I1208 21:10:43.111117 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nc4bf"] Dec 08 21:10:45 crc kubenswrapper[4781]: I1208 21:10:45.038404 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nc4bf" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerName="registry-server" containerID="cri-o://7d05f698ced240742e084fe4192aa9d7fd2c165dd3e00d30c36125beb4a86398" gracePeriod=2 Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.048260 4781 generic.go:334] "Generic (PLEG): container finished" podID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerID="7d05f698ced240742e084fe4192aa9d7fd2c165dd3e00d30c36125beb4a86398" exitCode=0 Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.048334 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nc4bf" event={"ID":"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08","Type":"ContainerDied","Data":"7d05f698ced240742e084fe4192aa9d7fd2c165dd3e00d30c36125beb4a86398"} Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.048636 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nc4bf" 
event={"ID":"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08","Type":"ContainerDied","Data":"334c2cef89c1da14fe0a254cbbfcbb5a0a4403785fb29add0e5d8d05f3db45b1"} Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.048654 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="334c2cef89c1da14fe0a254cbbfcbb5a0a4403785fb29add0e5d8d05f3db45b1" Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.050269 4781 generic.go:334] "Generic (PLEG): container finished" podID="96b2dd97-8174-435c-85d0-02c2d39239fd" containerID="e5f3756745fb4b475b8612b1169575c0dc68f5651ff2e5840c24683e978d473b" exitCode=0 Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.050288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/crc-debug-dclth" event={"ID":"96b2dd97-8174-435c-85d0-02c2d39239fd","Type":"ContainerDied","Data":"e5f3756745fb4b475b8612b1169575c0dc68f5651ff2e5840c24683e978d473b"} Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.142183 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.321566 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-utilities\") pod \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.321968 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-catalog-content\") pod \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.322017 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr5jz\" (UniqueName: \"kubernetes.io/projected/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-kube-api-access-wr5jz\") pod \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\" (UID: \"84999d9a-aba9-4aee-8ee2-3b1eee2e6a08\") " Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.322980 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-utilities" (OuterVolumeSpecName: "utilities") pod "84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" (UID: "84999d9a-aba9-4aee-8ee2-3b1eee2e6a08"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.335666 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-kube-api-access-wr5jz" (OuterVolumeSpecName: "kube-api-access-wr5jz") pod "84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" (UID: "84999d9a-aba9-4aee-8ee2-3b1eee2e6a08"). InnerVolumeSpecName "kube-api-access-wr5jz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.384628 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" (UID: "84999d9a-aba9-4aee-8ee2-3b1eee2e6a08"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.423753 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.423781 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:46 crc kubenswrapper[4781]: I1208 21:10:46.423793 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr5jz\" (UniqueName: \"kubernetes.io/projected/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08-kube-api-access-wr5jz\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.057949 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nc4bf" Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.181705 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.195533 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nc4bf"] Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.207079 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nc4bf"] Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.236971 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-455h5/crc-debug-dclth"] Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.245015 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-455h5/crc-debug-dclth"] Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.337758 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlffr\" (UniqueName: \"kubernetes.io/projected/96b2dd97-8174-435c-85d0-02c2d39239fd-kube-api-access-qlffr\") pod \"96b2dd97-8174-435c-85d0-02c2d39239fd\" (UID: \"96b2dd97-8174-435c-85d0-02c2d39239fd\") " Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.337836 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b2dd97-8174-435c-85d0-02c2d39239fd-host\") pod \"96b2dd97-8174-435c-85d0-02c2d39239fd\" (UID: \"96b2dd97-8174-435c-85d0-02c2d39239fd\") " Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.338315 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96b2dd97-8174-435c-85d0-02c2d39239fd-host" (OuterVolumeSpecName: "host") pod "96b2dd97-8174-435c-85d0-02c2d39239fd" (UID: "96b2dd97-8174-435c-85d0-02c2d39239fd"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.346167 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b2dd97-8174-435c-85d0-02c2d39239fd-kube-api-access-qlffr" (OuterVolumeSpecName: "kube-api-access-qlffr") pod "96b2dd97-8174-435c-85d0-02c2d39239fd" (UID: "96b2dd97-8174-435c-85d0-02c2d39239fd"). InnerVolumeSpecName "kube-api-access-qlffr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.440006 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlffr\" (UniqueName: \"kubernetes.io/projected/96b2dd97-8174-435c-85d0-02c2d39239fd-kube-api-access-qlffr\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:47 crc kubenswrapper[4781]: I1208 21:10:47.440038 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b2dd97-8174-435c-85d0-02c2d39239fd-host\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.068509 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41be928cd9944d0de9c8904a8c03c6f5ff67c18a6910021d1f5aa0f901cfb925" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.068856 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-dclth" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.135664 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" path="/var/lib/kubelet/pods/84999d9a-aba9-4aee-8ee2-3b1eee2e6a08/volumes" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.136407 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b2dd97-8174-435c-85d0-02c2d39239fd" path="/var/lib/kubelet/pods/96b2dd97-8174-435c-85d0-02c2d39239fd/volumes" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.426740 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-455h5/crc-debug-tdqft"] Dec 08 21:10:48 crc kubenswrapper[4781]: E1208 21:10:48.428023 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerName="registry-server" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.428049 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerName="registry-server" Dec 08 21:10:48 crc kubenswrapper[4781]: E1208 21:10:48.428070 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b2dd97-8174-435c-85d0-02c2d39239fd" containerName="container-00" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.428079 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="96b2dd97-8174-435c-85d0-02c2d39239fd" containerName="container-00" Dec 08 21:10:48 crc kubenswrapper[4781]: E1208 21:10:48.428100 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerName="extract-utilities" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.428110 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerName="extract-utilities" Dec 08 21:10:48 crc kubenswrapper[4781]: E1208 21:10:48.428137 4781 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerName="extract-content" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.428146 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerName="extract-content" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.428381 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="84999d9a-aba9-4aee-8ee2-3b1eee2e6a08" containerName="registry-server" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.428417 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b2dd97-8174-435c-85d0-02c2d39239fd" containerName="container-00" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.429316 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.461939 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62k4v\" (UniqueName: \"kubernetes.io/projected/f6d371ad-d972-418e-ad19-780bf86e666e-kube-api-access-62k4v\") pod \"crc-debug-tdqft\" (UID: \"f6d371ad-d972-418e-ad19-780bf86e666e\") " pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.461994 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6d371ad-d972-418e-ad19-780bf86e666e-host\") pod \"crc-debug-tdqft\" (UID: \"f6d371ad-d972-418e-ad19-780bf86e666e\") " pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.563991 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62k4v\" (UniqueName: \"kubernetes.io/projected/f6d371ad-d972-418e-ad19-780bf86e666e-kube-api-access-62k4v\") pod 
\"crc-debug-tdqft\" (UID: \"f6d371ad-d972-418e-ad19-780bf86e666e\") " pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.564041 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6d371ad-d972-418e-ad19-780bf86e666e-host\") pod \"crc-debug-tdqft\" (UID: \"f6d371ad-d972-418e-ad19-780bf86e666e\") " pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.564289 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6d371ad-d972-418e-ad19-780bf86e666e-host\") pod \"crc-debug-tdqft\" (UID: \"f6d371ad-d972-418e-ad19-780bf86e666e\") " pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.603103 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62k4v\" (UniqueName: \"kubernetes.io/projected/f6d371ad-d972-418e-ad19-780bf86e666e-kube-api-access-62k4v\") pod \"crc-debug-tdqft\" (UID: \"f6d371ad-d972-418e-ad19-780bf86e666e\") " pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:48 crc kubenswrapper[4781]: I1208 21:10:48.745024 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:48 crc kubenswrapper[4781]: W1208 21:10:48.778837 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d371ad_d972_418e_ad19_780bf86e666e.slice/crio-2a96aa52cb5043cde4b7498dc8f18ac8971be060fe2af65d6b57027a845d65df WatchSource:0}: Error finding container 2a96aa52cb5043cde4b7498dc8f18ac8971be060fe2af65d6b57027a845d65df: Status 404 returned error can't find the container with id 2a96aa52cb5043cde4b7498dc8f18ac8971be060fe2af65d6b57027a845d65df Dec 08 21:10:49 crc kubenswrapper[4781]: I1208 21:10:49.083279 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/crc-debug-tdqft" event={"ID":"f6d371ad-d972-418e-ad19-780bf86e666e","Type":"ContainerStarted","Data":"ba1830a5065da7fd6fc35bdf8e50bda7a058760e1623aab89eca71f95faa39da"} Dec 08 21:10:49 crc kubenswrapper[4781]: I1208 21:10:49.083334 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/crc-debug-tdqft" event={"ID":"f6d371ad-d972-418e-ad19-780bf86e666e","Type":"ContainerStarted","Data":"2a96aa52cb5043cde4b7498dc8f18ac8971be060fe2af65d6b57027a845d65df"} Dec 08 21:10:49 crc kubenswrapper[4781]: I1208 21:10:49.126298 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-455h5/crc-debug-tdqft" podStartSLOduration=1.126066765 podStartE2EDuration="1.126066765s" podCreationTimestamp="2025-12-08 21:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-08 21:10:49.095553273 +0000 UTC m=+3965.246836660" watchObservedRunningTime="2025-12-08 21:10:49.126066765 +0000 UTC m=+3965.277350152" Dec 08 21:10:50 crc kubenswrapper[4781]: I1208 21:10:50.094705 4781 generic.go:334] "Generic (PLEG): container finished" podID="f6d371ad-d972-418e-ad19-780bf86e666e" 
containerID="ba1830a5065da7fd6fc35bdf8e50bda7a058760e1623aab89eca71f95faa39da" exitCode=0 Dec 08 21:10:50 crc kubenswrapper[4781]: I1208 21:10:50.094759 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/crc-debug-tdqft" event={"ID":"f6d371ad-d972-418e-ad19-780bf86e666e","Type":"ContainerDied","Data":"ba1830a5065da7fd6fc35bdf8e50bda7a058760e1623aab89eca71f95faa39da"} Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.212837 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.253857 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-455h5/crc-debug-tdqft"] Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.260839 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-455h5/crc-debug-tdqft"] Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.416141 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6d371ad-d972-418e-ad19-780bf86e666e-host" (OuterVolumeSpecName: "host") pod "f6d371ad-d972-418e-ad19-780bf86e666e" (UID: "f6d371ad-d972-418e-ad19-780bf86e666e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.416955 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6d371ad-d972-418e-ad19-780bf86e666e-host\") pod \"f6d371ad-d972-418e-ad19-780bf86e666e\" (UID: \"f6d371ad-d972-418e-ad19-780bf86e666e\") " Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.417159 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62k4v\" (UniqueName: \"kubernetes.io/projected/f6d371ad-d972-418e-ad19-780bf86e666e-kube-api-access-62k4v\") pod \"f6d371ad-d972-418e-ad19-780bf86e666e\" (UID: \"f6d371ad-d972-418e-ad19-780bf86e666e\") " Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.417782 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f6d371ad-d972-418e-ad19-780bf86e666e-host\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.422505 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d371ad-d972-418e-ad19-780bf86e666e-kube-api-access-62k4v" (OuterVolumeSpecName: "kube-api-access-62k4v") pod "f6d371ad-d972-418e-ad19-780bf86e666e" (UID: "f6d371ad-d972-418e-ad19-780bf86e666e"). InnerVolumeSpecName "kube-api-access-62k4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:10:51 crc kubenswrapper[4781]: I1208 21:10:51.518637 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62k4v\" (UniqueName: \"kubernetes.io/projected/f6d371ad-d972-418e-ad19-780bf86e666e-kube-api-access-62k4v\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.118737 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a96aa52cb5043cde4b7498dc8f18ac8971be060fe2af65d6b57027a845d65df" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.118847 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-tdqft" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.154713 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d371ad-d972-418e-ad19-780bf86e666e" path="/var/lib/kubelet/pods/f6d371ad-d972-418e-ad19-780bf86e666e/volumes" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.417000 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-455h5/crc-debug-cpjzd"] Dec 08 21:10:52 crc kubenswrapper[4781]: E1208 21:10:52.417431 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d371ad-d972-418e-ad19-780bf86e666e" containerName="container-00" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.417446 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d371ad-d972-418e-ad19-780bf86e666e" containerName="container-00" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.417718 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d371ad-d972-418e-ad19-780bf86e666e" containerName="container-00" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.418433 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.537600 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6d433b3-df5d-41af-9ecc-07c55cfcf706-host\") pod \"crc-debug-cpjzd\" (UID: \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\") " pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.537699 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59xjk\" (UniqueName: \"kubernetes.io/projected/e6d433b3-df5d-41af-9ecc-07c55cfcf706-kube-api-access-59xjk\") pod \"crc-debug-cpjzd\" (UID: \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\") " pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.640045 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59xjk\" (UniqueName: \"kubernetes.io/projected/e6d433b3-df5d-41af-9ecc-07c55cfcf706-kube-api-access-59xjk\") pod \"crc-debug-cpjzd\" (UID: \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\") " pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.640284 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6d433b3-df5d-41af-9ecc-07c55cfcf706-host\") pod \"crc-debug-cpjzd\" (UID: \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\") " pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.640420 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6d433b3-df5d-41af-9ecc-07c55cfcf706-host\") pod \"crc-debug-cpjzd\" (UID: \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\") " pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:52 crc 
kubenswrapper[4781]: I1208 21:10:52.656613 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59xjk\" (UniqueName: \"kubernetes.io/projected/e6d433b3-df5d-41af-9ecc-07c55cfcf706-kube-api-access-59xjk\") pod \"crc-debug-cpjzd\" (UID: \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\") " pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:52 crc kubenswrapper[4781]: I1208 21:10:52.750260 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:52 crc kubenswrapper[4781]: W1208 21:10:52.783891 4781 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d433b3_df5d_41af_9ecc_07c55cfcf706.slice/crio-8b5d95790962f11ad84357de2d491ea4921d4d9031f527f3628a55b397b37b23 WatchSource:0}: Error finding container 8b5d95790962f11ad84357de2d491ea4921d4d9031f527f3628a55b397b37b23: Status 404 returned error can't find the container with id 8b5d95790962f11ad84357de2d491ea4921d4d9031f527f3628a55b397b37b23 Dec 08 21:10:53 crc kubenswrapper[4781]: I1208 21:10:53.129760 4781 generic.go:334] "Generic (PLEG): container finished" podID="e6d433b3-df5d-41af-9ecc-07c55cfcf706" containerID="e8f25b2620fbc8e3ddd4b304fd931e32b8c90cbfce0997cb6bb8870004ccc7a6" exitCode=0 Dec 08 21:10:53 crc kubenswrapper[4781]: I1208 21:10:53.129808 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/crc-debug-cpjzd" event={"ID":"e6d433b3-df5d-41af-9ecc-07c55cfcf706","Type":"ContainerDied","Data":"e8f25b2620fbc8e3ddd4b304fd931e32b8c90cbfce0997cb6bb8870004ccc7a6"} Dec 08 21:10:53 crc kubenswrapper[4781]: I1208 21:10:53.130071 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/crc-debug-cpjzd" event={"ID":"e6d433b3-df5d-41af-9ecc-07c55cfcf706","Type":"ContainerStarted","Data":"8b5d95790962f11ad84357de2d491ea4921d4d9031f527f3628a55b397b37b23"} Dec 08 
21:10:53 crc kubenswrapper[4781]: I1208 21:10:53.174646 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-455h5/crc-debug-cpjzd"] Dec 08 21:10:53 crc kubenswrapper[4781]: I1208 21:10:53.184118 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-455h5/crc-debug-cpjzd"] Dec 08 21:10:54 crc kubenswrapper[4781]: I1208 21:10:54.271315 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:54 crc kubenswrapper[4781]: I1208 21:10:54.384246 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6d433b3-df5d-41af-9ecc-07c55cfcf706-host\") pod \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\" (UID: \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\") " Dec 08 21:10:54 crc kubenswrapper[4781]: I1208 21:10:54.384414 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6d433b3-df5d-41af-9ecc-07c55cfcf706-host" (OuterVolumeSpecName: "host") pod "e6d433b3-df5d-41af-9ecc-07c55cfcf706" (UID: "e6d433b3-df5d-41af-9ecc-07c55cfcf706"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 08 21:10:54 crc kubenswrapper[4781]: I1208 21:10:54.384804 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59xjk\" (UniqueName: \"kubernetes.io/projected/e6d433b3-df5d-41af-9ecc-07c55cfcf706-kube-api-access-59xjk\") pod \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\" (UID: \"e6d433b3-df5d-41af-9ecc-07c55cfcf706\") " Dec 08 21:10:54 crc kubenswrapper[4781]: I1208 21:10:54.385552 4781 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e6d433b3-df5d-41af-9ecc-07c55cfcf706-host\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:54 crc kubenswrapper[4781]: I1208 21:10:54.390271 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d433b3-df5d-41af-9ecc-07c55cfcf706-kube-api-access-59xjk" (OuterVolumeSpecName: "kube-api-access-59xjk") pod "e6d433b3-df5d-41af-9ecc-07c55cfcf706" (UID: "e6d433b3-df5d-41af-9ecc-07c55cfcf706"). InnerVolumeSpecName "kube-api-access-59xjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:10:54 crc kubenswrapper[4781]: I1208 21:10:54.487160 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59xjk\" (UniqueName: \"kubernetes.io/projected/e6d433b3-df5d-41af-9ecc-07c55cfcf706-kube-api-access-59xjk\") on node \"crc\" DevicePath \"\"" Dec 08 21:10:55 crc kubenswrapper[4781]: I1208 21:10:55.126523 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:10:55 crc kubenswrapper[4781]: E1208 21:10:55.126780 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:10:55 crc kubenswrapper[4781]: I1208 21:10:55.167285 4781 scope.go:117] "RemoveContainer" containerID="e8f25b2620fbc8e3ddd4b304fd931e32b8c90cbfce0997cb6bb8870004ccc7a6" Dec 08 21:10:55 crc kubenswrapper[4781]: I1208 21:10:55.167318 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-455h5/crc-debug-cpjzd" Dec 08 21:10:56 crc kubenswrapper[4781]: I1208 21:10:56.145608 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d433b3-df5d-41af-9ecc-07c55cfcf706" path="/var/lib/kubelet/pods/e6d433b3-df5d-41af-9ecc-07c55cfcf706/volumes" Dec 08 21:11:06 crc kubenswrapper[4781]: I1208 21:11:06.129559 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:11:06 crc kubenswrapper[4781]: E1208 21:11:06.130275 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:11:18 crc kubenswrapper[4781]: I1208 21:11:18.521903 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56858f8966-hkmxw_ed36d878-8827-467c-95a0-1450798ad50e/barbican-api/0.log" Dec 08 21:11:18 crc kubenswrapper[4781]: I1208 21:11:18.658081 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56858f8966-hkmxw_ed36d878-8827-467c-95a0-1450798ad50e/barbican-api-log/0.log" Dec 08 21:11:18 crc kubenswrapper[4781]: I1208 21:11:18.720590 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6d674cb658-5bj7r_dcaa98ab-e000-41ef-bf28-189680138c66/barbican-keystone-listener/0.log" Dec 08 21:11:18 crc kubenswrapper[4781]: I1208 21:11:18.785637 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6d674cb658-5bj7r_dcaa98ab-e000-41ef-bf28-189680138c66/barbican-keystone-listener-log/0.log" Dec 08 21:11:18 crc kubenswrapper[4781]: 
I1208 21:11:18.986905 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-676d96fbc7-6xfzf_558023e1-d94c-4422-a958-796ba9bf387f/barbican-worker/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.082320 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-676d96fbc7-6xfzf_558023e1-d94c-4422-a958-796ba9bf387f/barbican-worker-log/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.170471 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-qgsdq_3eca1a1d-a60c-4911-9cf8-fd8a82f9541c/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.339063 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9447ae7c-50db-46a0-aeec-7718944d900e/ceilometer-notification-agent/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.341292 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9447ae7c-50db-46a0-aeec-7718944d900e/ceilometer-central-agent/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.404585 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9447ae7c-50db-46a0-aeec-7718944d900e/proxy-httpd/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.459879 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_9447ae7c-50db-46a0-aeec-7718944d900e/sg-core/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.582376 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5c1478d5-3e67-4d45-b4c3-d5612e46db8d/cinder-api-log/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.587642 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_5c1478d5-3e67-4d45-b4c3-d5612e46db8d/cinder-api/0.log" Dec 08 21:11:19 crc 
kubenswrapper[4781]: I1208 21:11:19.769110 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ce9b542d-77aa-4a8d-95fe-393a5a0dafa2/probe/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.847166 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_ce9b542d-77aa-4a8d-95fe-393a5a0dafa2/cinder-scheduler/0.log" Dec 08 21:11:19 crc kubenswrapper[4781]: I1208 21:11:19.910640 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-b5s5d_b174f924-296a-45a3-b80b-fdec0f219fa8/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.000224 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hbzns_c7f682bd-1ad1-4917-8c54-7f76ef956f09/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.103117 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6574f55bb5-rdnzp_11821178-db83-4950-8900-5f6fcc68f184/init/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.127747 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:11:20 crc kubenswrapper[4781]: E1208 21:11:20.128007 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.273557 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6574f55bb5-rdnzp_11821178-db83-4950-8900-5f6fcc68f184/init/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.334384 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mhwdk_aa022451-2529-456c-99bf-9c36b807312e/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.356862 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6574f55bb5-rdnzp_11821178-db83-4950-8900-5f6fcc68f184/dnsmasq-dns/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.518441 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8238f3d8-03ef-4f26-a327-0f9e931aa7a6/glance-log/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.528246 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8238f3d8-03ef-4f26-a327-0f9e931aa7a6/glance-httpd/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.694216 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_901250f8-f46c-49d0-83d3-3b0c7affea54/glance-httpd/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.714231 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_901250f8-f46c-49d0-83d3-3b0c7affea54/glance-log/0.log" Dec 08 21:11:20 crc kubenswrapper[4781]: I1208 21:11:20.856398 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75dd9f77c4-85lhw_eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf/horizon/0.log" Dec 08 21:11:21 crc kubenswrapper[4781]: I1208 21:11:21.055555 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-z64vf_98d6ed01-20fc-4e72-a8cf-2e53a8e6103e/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:21 crc kubenswrapper[4781]: I1208 21:11:21.240257 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-p9qr6_c08d5f1c-63e8-4974-b50c-29b0e8db5e9a/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:21 crc kubenswrapper[4781]: I1208 21:11:21.280718 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-75dd9f77c4-85lhw_eb3c8f73-65cf-4aa0-8c87-658aaaa1fcaf/horizon-log/0.log" Dec 08 21:11:21 crc kubenswrapper[4781]: I1208 21:11:21.480016 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29420461-q4qhq_84bf4f32-efbb-4625-9863-20c2b169937a/keystone-cron/0.log" Dec 08 21:11:21 crc kubenswrapper[4781]: I1208 21:11:21.528355 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-796ff6864f-m6jf6_47dda6bf-1ed5-4df0-a647-32e518a7514f/keystone-api/0.log" Dec 08 21:11:21 crc kubenswrapper[4781]: I1208 21:11:21.653076 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_acc0a735-094b-4857-8238-5240530c62dc/kube-state-metrics/0.log" Dec 08 21:11:21 crc kubenswrapper[4781]: I1208 21:11:21.716282 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-496zk_0d819560-5e37-4cbe-8276-f5c63dd9610c/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:22 crc kubenswrapper[4781]: I1208 21:11:22.042008 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b4c587749-kmfjj_b3890267-5a89-4612-89f0-1bb7ba0e1245/neutron-httpd/0.log" Dec 08 21:11:22 crc kubenswrapper[4781]: I1208 21:11:22.042891 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-b4c587749-kmfjj_b3890267-5a89-4612-89f0-1bb7ba0e1245/neutron-api/0.log" Dec 08 21:11:22 crc kubenswrapper[4781]: I1208 21:11:22.088838 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-6wcqh_f0d82d95-8bf8-4845-a305-cca05358ffdb/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:22 crc kubenswrapper[4781]: I1208 21:11:22.602535 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cba17aa8-5de4-4747-bc12-50b1d1b66490/nova-api-log/0.log" Dec 08 21:11:22 crc kubenswrapper[4781]: I1208 21:11:22.642006 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5f31e53b-8234-4f29-bfcf-d3c037103945/nova-cell0-conductor-conductor/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.021582 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_af1b29f3-e2a8-4a48-9d18-8c502c3f435c/nova-cell1-conductor-conductor/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.053143 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_28fbba80-b3c1-45f6-ad0a-3435a48fd033/nova-cell1-novncproxy-novncproxy/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.061048 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cba17aa8-5de4-4747-bc12-50b1d1b66490/nova-api-api/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.225338 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-5rsvg_69ce8819-1a24-4b28-9438-c92c07b4dbca/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.408060 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_2348d949-d7f2-44a5-8e47-58b358d060c8/nova-metadata-log/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.760653 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_95c2c3c0-0733-4bac-bf28-0805d8c9a499/mysql-bootstrap/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.849608 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_daf2aa43-ddc7-4618-a3df-665b947b68bd/nova-scheduler-scheduler/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.876778 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_95c2c3c0-0733-4bac-bf28-0805d8c9a499/mysql-bootstrap/0.log" Dec 08 21:11:23 crc kubenswrapper[4781]: I1208 21:11:23.948130 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_95c2c3c0-0733-4bac-bf28-0805d8c9a499/galera/0.log" Dec 08 21:11:24 crc kubenswrapper[4781]: I1208 21:11:24.069784 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_88416662-c07f-4d9f-b9cb-7f92d21aaa6f/mysql-bootstrap/0.log" Dec 08 21:11:24 crc kubenswrapper[4781]: I1208 21:11:24.281779 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_88416662-c07f-4d9f-b9cb-7f92d21aaa6f/mysql-bootstrap/0.log" Dec 08 21:11:24 crc kubenswrapper[4781]: I1208 21:11:24.354011 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_88416662-c07f-4d9f-b9cb-7f92d21aaa6f/galera/0.log" Dec 08 21:11:24 crc kubenswrapper[4781]: I1208 21:11:24.479482 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ee967571-7083-4ff7-a035-b90fadf420ee/openstackclient/0.log" Dec 08 21:11:24 crc kubenswrapper[4781]: I1208 21:11:24.584650 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-kf4ss_a7f5afd4-05f3-4954-9dc9-3efa47c22b85/ovn-controller/0.log" Dec 08 21:11:24 crc kubenswrapper[4781]: I1208 21:11:24.718248 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2348d949-d7f2-44a5-8e47-58b358d060c8/nova-metadata-metadata/0.log" Dec 08 21:11:24 crc kubenswrapper[4781]: I1208 21:11:24.754836 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9w8ql_0d8cf2d7-e85f-49e8-95e2-c1548f506888/openstack-network-exporter/0.log" Dec 08 21:11:24 crc kubenswrapper[4781]: I1208 21:11:24.897568 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9cv7c_b675bf96-ecb2-4098-891f-6a87e0ed5140/ovsdb-server-init/0.log" Dec 08 21:11:25 crc kubenswrapper[4781]: I1208 21:11:25.089386 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9cv7c_b675bf96-ecb2-4098-891f-6a87e0ed5140/ovs-vswitchd/0.log" Dec 08 21:11:25 crc kubenswrapper[4781]: I1208 21:11:25.129819 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9cv7c_b675bf96-ecb2-4098-891f-6a87e0ed5140/ovsdb-server/0.log" Dec 08 21:11:25 crc kubenswrapper[4781]: I1208 21:11:25.134888 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9cv7c_b675bf96-ecb2-4098-891f-6a87e0ed5140/ovsdb-server-init/0.log" Dec 08 21:11:25 crc kubenswrapper[4781]: I1208 21:11:25.333286 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-4wchf_8f4acc2b-373a-48e5-916b-a0fcfcb83851/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:25 crc kubenswrapper[4781]: I1208 21:11:25.359874 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a1bbb8af-58b4-4eff-9e81-5206ecc06b2e/openstack-network-exporter/0.log" Dec 08 21:11:25 crc kubenswrapper[4781]: I1208 
21:11:25.453574 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_a1bbb8af-58b4-4eff-9e81-5206ecc06b2e/ovn-northd/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.060190 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3530fc96-f407-470c-a960-c7cfd844c517/openstack-network-exporter/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.102824 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3530fc96-f407-470c-a960-c7cfd844c517/ovsdbserver-nb/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.285740 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5544d7c5-67c2-4f2e-9e0f-d8307d831d5d/ovsdbserver-sb/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.389983 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_5544d7c5-67c2-4f2e-9e0f-d8307d831d5d/openstack-network-exporter/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.460068 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cb6b5bcfd-8jc9n_684ebfa2-2a23-4f1f-96cc-d436e63feede/placement-api/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.568362 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7cb6b5bcfd-8jc9n_684ebfa2-2a23-4f1f-96cc-d436e63feede/placement-log/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.631228 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_984eb37b-647a-4e37-b4bc-6e7a3becb3ce/setup-container/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.867201 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20fc56e1-a1f6-4495-834a-41bfebf14aef/setup-container/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.874586 4781 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_984eb37b-647a-4e37-b4bc-6e7a3becb3ce/rabbitmq/0.log" Dec 08 21:11:26 crc kubenswrapper[4781]: I1208 21:11:26.921646 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_984eb37b-647a-4e37-b4bc-6e7a3becb3ce/setup-container/0.log" Dec 08 21:11:27 crc kubenswrapper[4781]: I1208 21:11:27.134654 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20fc56e1-a1f6-4495-834a-41bfebf14aef/setup-container/0.log" Dec 08 21:11:27 crc kubenswrapper[4781]: I1208 21:11:27.188744 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20fc56e1-a1f6-4495-834a-41bfebf14aef/rabbitmq/0.log" Dec 08 21:11:27 crc kubenswrapper[4781]: I1208 21:11:27.215226 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rhvxx_29315241-935b-40dc-b49d-d8f18cbb4d38/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:27 crc kubenswrapper[4781]: I1208 21:11:27.881704 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-nbjfw_793b9e15-75d8-49b1-8261-cc624d33aaea/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:27 crc kubenswrapper[4781]: I1208 21:11:27.892928 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-jr5wd_3cfbf369-bbb7-4b9e-980d-32fe2cf76c39/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.099698 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-xxkqq_96db8e7d-bc3a-4804-af50-6f403dbbcc26/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.274837 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-b4xf8_857a2e5e-21b6-450e-8578-240e91f419f7/ssh-known-hosts-edpm-deployment/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.467429 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7dc689b5b9-6r5rd_d6925de4-2c7d-43cb-b2a9-1ec66f56b007/proxy-server/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.549729 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7dc689b5b9-6r5rd_d6925de4-2c7d-43cb-b2a9-1ec66f56b007/proxy-httpd/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.582831 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-cdbxl_ec594735-a472-4c13-b98b-453a80fceb1d/swift-ring-rebalance/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.798529 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/account-auditor/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.800414 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/account-reaper/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.815738 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/account-replicator/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.914639 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/account-server/0.log" Dec 08 21:11:28 crc kubenswrapper[4781]: I1208 21:11:28.984358 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/container-auditor/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.013844 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/container-replicator/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.021248 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/container-server/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.140650 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/container-updater/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.167675 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-auditor/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.202522 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-expirer/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.226214 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-replicator/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.330794 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-server/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.357599 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/rsync/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.379852 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/object-updater/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.415955 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_75ab4f11-7508-4813-83bd-05ef029af585/swift-recon-cron/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.592301 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-kpqhf_9c03b281-e533-4108-9eff-0930b52141ca/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.681509 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e2b97db1-1e2a-45e9-b959-fde154131c3b/tempest-tests-tempest-tests-runner/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.874967 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_71062a6c-6abb-416f-9c98-a98fb20ad2ad/test-operator-logs-container/0.log" Dec 08 21:11:29 crc kubenswrapper[4781]: I1208 21:11:29.906358 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-x7nv7_e9261e8f-a212-4633-bc8d-06c952d3dc9f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 08 21:11:31 crc kubenswrapper[4781]: I1208 21:11:31.126453 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:11:31 crc kubenswrapper[4781]: E1208 21:11:31.127799 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:11:38 crc kubenswrapper[4781]: I1208 21:11:38.676129 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_3be01047-1cc6-4ed4-9d41-68b1f67f7a11/memcached/0.log" Dec 08 21:11:46 crc kubenswrapper[4781]: I1208 21:11:46.190284 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:11:46 crc kubenswrapper[4781]: E1208 21:11:46.191158 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:11:56 crc kubenswrapper[4781]: I1208 21:11:56.474385 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/util/0.log" Dec 08 21:11:56 crc kubenswrapper[4781]: I1208 21:11:56.649440 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/util/0.log" Dec 08 21:11:56 crc kubenswrapper[4781]: I1208 21:11:56.697782 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/pull/0.log" Dec 08 21:11:56 crc kubenswrapper[4781]: I1208 21:11:56.702171 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/pull/0.log" Dec 08 21:11:56 crc kubenswrapper[4781]: I1208 21:11:56.867235 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/pull/0.log" Dec 08 21:11:56 crc kubenswrapper[4781]: I1208 21:11:56.973139 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/extract/0.log" Dec 08 21:11:56 crc kubenswrapper[4781]: I1208 21:11:56.988515 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b9120eefd1323238be49f2f5c049f678a31f5be734312c37cc3d85ecabgkkf2_5a0379ef-fb54-4ec0-b240-fbe55e702817/util/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.144103 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-spkzc_33445aa2-3e5b-4b50-ba7a-0f86d08dd64d/kube-rbac-proxy/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.175085 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-tk48g_17686434-377d-4a2f-b25e-e0074d2e06c6/kube-rbac-proxy/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.183640 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-spkzc_33445aa2-3e5b-4b50-ba7a-0f86d08dd64d/manager/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.357698 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6c677c69b-tk48g_17686434-377d-4a2f-b25e-e0074d2e06c6/manager/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.378022 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-cg5z2_c1ed3c21-a6cb-43f6-a018-8bead69a5439/kube-rbac-proxy/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: 
I1208 21:11:57.402697 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-697fb699cf-cg5z2_c1ed3c21-a6cb-43f6-a018-8bead69a5439/manager/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.542850 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-64kcb_b44e4d42-05a5-42e2-8a45-5d5506fbbb23/kube-rbac-proxy/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.629457 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5697bb5779-64kcb_b44e4d42-05a5-42e2-8a45-5d5506fbbb23/manager/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.743476 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-7dhp5_5e629e44-f8b6-410a-baa9-b076e609686c/kube-rbac-proxy/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.757365 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-7dhp5_5e629e44-f8b6-410a-baa9-b076e609686c/manager/0.log" Dec 08 21:11:57 crc kubenswrapper[4781]: I1208 21:11:57.798267 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-bkrvn_76fef842-95b4-47f1-9c34-a4edc70a3cbf/kube-rbac-proxy/0.log" Dec 08 21:11:58 crc kubenswrapper[4781]: I1208 21:11:58.626574 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5mbrr_47e01596-c50b-44f5-82fb-1b6c7a005d10/kube-rbac-proxy/0.log" Dec 08 21:11:58 crc kubenswrapper[4781]: I1208 21:11:58.646633 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-bkrvn_76fef842-95b4-47f1-9c34-a4edc70a3cbf/manager/0.log" 
Dec 08 21:11:58 crc kubenswrapper[4781]: I1208 21:11:58.903174 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-cv24b_97f1308f-60a9-4e7f-b029-8bc13246ba9e/kube-rbac-proxy/0.log" Dec 08 21:11:58 crc kubenswrapper[4781]: I1208 21:11:58.908782 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-78d48bff9d-5mbrr_47e01596-c50b-44f5-82fb-1b6c7a005d10/manager/0.log" Dec 08 21:11:58 crc kubenswrapper[4781]: I1208 21:11:58.909297 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-967d97867-cv24b_97f1308f-60a9-4e7f-b029-8bc13246ba9e/manager/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.050946 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fjhrn_344dd244-42be-4538-92c0-ab4be8f8a093/kube-rbac-proxy/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.127738 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:11:59 crc kubenswrapper[4781]: E1208 21:11:59.128020 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.182163 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fjhrn_344dd244-42be-4538-92c0-ab4be8f8a093/manager/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 
21:11:59.280250 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-swkcz_51e5598f-4979-4f4d-a947-323c19dd3102/kube-rbac-proxy/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.287733 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5b5fd79c9c-swkcz_51e5598f-4979-4f4d-a947-323c19dd3102/manager/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.308897 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-m9tns_2c1ab608-2b41-451a-b6b9-2cf867ab289b/kube-rbac-proxy/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.402509 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-79c8c4686c-m9tns_2c1ab608-2b41-451a-b6b9-2cf867ab289b/manager/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.503617 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zkslx_3b159a78-5c17-430a-ac90-f1d4e7fac757/kube-rbac-proxy/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.513400 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-zkslx_3b159a78-5c17-430a-ac90-f1d4e7fac757/manager/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.593195 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p7f7t_e00e8983-d123-42e8-a4ef-2a2bbda78cde/kube-rbac-proxy/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.714072 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-p7f7t_e00e8983-d123-42e8-a4ef-2a2bbda78cde/manager/0.log" Dec 08 
21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.741349 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-fql79_78753ed6-755b-4e63-8026-50722a9637a9/kube-rbac-proxy/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.859839 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-fql79_78753ed6-755b-4e63-8026-50722a9637a9/manager/0.log" Dec 08 21:11:59 crc kubenswrapper[4781]: I1208 21:11:59.989498 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-744f8cb766hj2j5_0bafa66c-7cf8-40eb-ae15-a4365fbe3176/kube-rbac-proxy/0.log" Dec 08 21:12:00 crc kubenswrapper[4781]: I1208 21:12:00.066448 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-744f8cb766hj2j5_0bafa66c-7cf8-40eb-ae15-a4365fbe3176/manager/0.log" Dec 08 21:12:00 crc kubenswrapper[4781]: I1208 21:12:00.378603 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-924xv_fa455afd-33f9-4f97-9eb8-838444176453/registry-server/0.log" Dec 08 21:12:00 crc kubenswrapper[4781]: I1208 21:12:00.425967 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5f48db4cb9-gl7xk_e3c328d0-5fc0-4900-b90b-0b89bf486395/operator/0.log" Dec 08 21:12:00 crc kubenswrapper[4781]: I1208 21:12:00.515981 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-dnh9j_b5739409-2e10-4acf-8088-99608fc2f489/kube-rbac-proxy/0.log" Dec 08 21:12:00 crc kubenswrapper[4781]: I1208 21:12:00.681576 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-dnh9j_b5739409-2e10-4acf-8088-99608fc2f489/manager/0.log" Dec 08 21:12:00 crc kubenswrapper[4781]: I1208 21:12:00.723223 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rdrrr_b86b0a68-3152-46ac-8bde-3bfd32c6fbf2/kube-rbac-proxy/0.log" Dec 08 21:12:00 crc kubenswrapper[4781]: I1208 21:12:00.763690 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-rdrrr_b86b0a68-3152-46ac-8bde-3bfd32c6fbf2/manager/0.log" Dec 08 21:12:00 crc kubenswrapper[4781]: I1208 21:12:00.967958 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-p26p9_60bdaea6-28dd-4dac-b1b8-a046ea0c90e0/operator/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.059343 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-24k5k_9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e/kube-rbac-proxy/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.161857 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9d58d64bc-24k5k_9bdbb305-c9d1-47d0-b6c4-b4d4da8a8a6e/manager/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.322676 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-j5vws_7f697324-2ce1-4c81-89c8-9cc53bac7062/kube-rbac-proxy/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.331737 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b47684954-hcdhf_0f03e65d-f0b8-4cfa-90bd-4a70de607c2d/manager/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.420904 4781 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-58d5ff84df-j5vws_7f697324-2ce1-4c81-89c8-9cc53bac7062/manager/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.550441 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-2b4gp_908c60a3-663f-4a4a-9223-fbcff50de2b9/kube-rbac-proxy/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.589943 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-fcn46_db28fdab-dab1-4d5c-9447-c895523b0985/kube-rbac-proxy/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.590741 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-2b4gp_908c60a3-663f-4a4a-9223-fbcff50de2b9/manager/0.log" Dec 08 21:12:01 crc kubenswrapper[4781]: I1208 21:12:01.649374 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-667bd8d554-fcn46_db28fdab-dab1-4d5c-9447-c895523b0985/manager/0.log" Dec 08 21:12:12 crc kubenswrapper[4781]: I1208 21:12:12.166653 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:12:12 crc kubenswrapper[4781]: E1208 21:12:12.167602 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.777744 4781 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-vl5k6"] Dec 08 21:12:17 crc kubenswrapper[4781]: E1208 21:12:17.779111 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d433b3-df5d-41af-9ecc-07c55cfcf706" containerName="container-00" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.779145 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d433b3-df5d-41af-9ecc-07c55cfcf706" containerName="container-00" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.779509 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d433b3-df5d-41af-9ecc-07c55cfcf706" containerName="container-00" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.781850 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.800078 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vl5k6"] Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.822433 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-catalog-content\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.822520 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-utilities\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.822553 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-98fxh\" (UniqueName: \"kubernetes.io/projected/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-kube-api-access-98fxh\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.923723 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-catalog-content\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.923822 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-utilities\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.923855 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98fxh\" (UniqueName: \"kubernetes.io/projected/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-kube-api-access-98fxh\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.924338 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-catalog-content\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:17 crc kubenswrapper[4781]: I1208 21:12:17.924474 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-utilities\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:18 crc kubenswrapper[4781]: I1208 21:12:18.059478 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98fxh\" (UniqueName: \"kubernetes.io/projected/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-kube-api-access-98fxh\") pod \"certified-operators-vl5k6\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:18 crc kubenswrapper[4781]: I1208 21:12:18.132660 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:18 crc kubenswrapper[4781]: I1208 21:12:18.656885 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vl5k6"] Dec 08 21:12:19 crc kubenswrapper[4781]: I1208 21:12:19.600661 4781 generic.go:334] "Generic (PLEG): container finished" podID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerID="da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886" exitCode=0 Dec 08 21:12:19 crc kubenswrapper[4781]: I1208 21:12:19.600832 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vl5k6" event={"ID":"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9","Type":"ContainerDied","Data":"da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886"} Dec 08 21:12:19 crc kubenswrapper[4781]: I1208 21:12:19.601288 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vl5k6" event={"ID":"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9","Type":"ContainerStarted","Data":"d7cd2263e0a366b2eb7d8fac8d3f29554cecc420d848b474a45c43da46236907"} Dec 08 21:12:20 crc kubenswrapper[4781]: I1208 21:12:20.482791 4781 trace.go:236] Trace[1260373196]: 
"Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-4s86h" (08-Dec-2025 21:12:19.447) (total time: 1035ms): Dec 08 21:12:20 crc kubenswrapper[4781]: Trace[1260373196]: [1.035221019s] [1.035221019s] END Dec 08 21:12:21 crc kubenswrapper[4781]: I1208 21:12:21.623410 4781 generic.go:334] "Generic (PLEG): container finished" podID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerID="a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f" exitCode=0 Dec 08 21:12:21 crc kubenswrapper[4781]: I1208 21:12:21.623514 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vl5k6" event={"ID":"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9","Type":"ContainerDied","Data":"a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f"} Dec 08 21:12:22 crc kubenswrapper[4781]: I1208 21:12:22.837660 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vl5k6" event={"ID":"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9","Type":"ContainerStarted","Data":"d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6"} Dec 08 21:12:22 crc kubenswrapper[4781]: I1208 21:12:22.857068 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vl5k6" podStartSLOduration=3.452806455 podStartE2EDuration="5.85705208s" podCreationTimestamp="2025-12-08 21:12:17 +0000 UTC" firstStartedPulling="2025-12-08 21:12:19.606784807 +0000 UTC m=+4055.758068204" lastFinishedPulling="2025-12-08 21:12:22.011030452 +0000 UTC m=+4058.162313829" observedRunningTime="2025-12-08 21:12:22.852428067 +0000 UTC m=+4059.003711444" watchObservedRunningTime="2025-12-08 21:12:22.85705208 +0000 UTC m=+4059.008335457" Dec 08 21:12:24 crc kubenswrapper[4781]: I1208 21:12:24.976150 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-29wgh_a9fba569-4fe5-44f1-905c-c0e844f64fca/control-plane-machine-set-operator/0.log" Dec 08 21:12:25 crc kubenswrapper[4781]: I1208 21:12:25.126224 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:12:25 crc kubenswrapper[4781]: E1208 21:12:25.126507 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:12:25 crc kubenswrapper[4781]: I1208 21:12:25.140364 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lx5mt_513ea4da-405c-4176-adbd-c8e5f68c631c/machine-api-operator/0.log" Dec 08 21:12:25 crc kubenswrapper[4781]: I1208 21:12:25.162863 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lx5mt_513ea4da-405c-4176-adbd-c8e5f68c631c/kube-rbac-proxy/0.log" Dec 08 21:12:28 crc kubenswrapper[4781]: I1208 21:12:28.140520 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:28 crc kubenswrapper[4781]: I1208 21:12:28.141054 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:28 crc kubenswrapper[4781]: I1208 21:12:28.185900 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:28 crc kubenswrapper[4781]: I1208 21:12:28.930979 4781 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:28 crc kubenswrapper[4781]: I1208 21:12:28.997567 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vl5k6"] Dec 08 21:12:30 crc kubenswrapper[4781]: I1208 21:12:30.915658 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vl5k6" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerName="registry-server" containerID="cri-o://d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6" gracePeriod=2 Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.438091 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.473724 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98fxh\" (UniqueName: \"kubernetes.io/projected/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-kube-api-access-98fxh\") pod \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.473963 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-catalog-content\") pod \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.474045 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-utilities\") pod \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\" (UID: \"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9\") " Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.475493 4781 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-utilities" (OuterVolumeSpecName: "utilities") pod "2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" (UID: "2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.483733 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-kube-api-access-98fxh" (OuterVolumeSpecName: "kube-api-access-98fxh") pod "2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" (UID: "2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9"). InnerVolumeSpecName "kube-api-access-98fxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.534190 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" (UID: "2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.575643 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-utilities\") on node \"crc\" DevicePath \"\"" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.575676 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98fxh\" (UniqueName: \"kubernetes.io/projected/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-kube-api-access-98fxh\") on node \"crc\" DevicePath \"\"" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.575689 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.926939 4781 generic.go:334] "Generic (PLEG): container finished" podID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerID="d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6" exitCode=0 Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.926994 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vl5k6" event={"ID":"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9","Type":"ContainerDied","Data":"d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6"} Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.927037 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vl5k6" event={"ID":"2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9","Type":"ContainerDied","Data":"d7cd2263e0a366b2eb7d8fac8d3f29554cecc420d848b474a45c43da46236907"} Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.926994 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vl5k6" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.927060 4781 scope.go:117] "RemoveContainer" containerID="d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.974099 4781 scope.go:117] "RemoveContainer" containerID="a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f" Dec 08 21:12:31 crc kubenswrapper[4781]: I1208 21:12:31.979144 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vl5k6"] Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.006088 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vl5k6"] Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.006101 4781 scope.go:117] "RemoveContainer" containerID="da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886" Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.043313 4781 scope.go:117] "RemoveContainer" containerID="d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6" Dec 08 21:12:32 crc kubenswrapper[4781]: E1208 21:12:32.043766 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6\": container with ID starting with d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6 not found: ID does not exist" containerID="d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6" Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.043829 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6"} err="failed to get container status \"d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6\": rpc error: code = NotFound desc = could not find 
container \"d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6\": container with ID starting with d3ebb9383fd03c2f3a9974ba01ac6b903fb09c9dcd7d811589d3599caa508ac6 not found: ID does not exist" Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.043868 4781 scope.go:117] "RemoveContainer" containerID="a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f" Dec 08 21:12:32 crc kubenswrapper[4781]: E1208 21:12:32.044316 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f\": container with ID starting with a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f not found: ID does not exist" containerID="a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f" Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.044364 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f"} err="failed to get container status \"a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f\": rpc error: code = NotFound desc = could not find container \"a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f\": container with ID starting with a2a8a09d0b2b9af361d6bfe213f21e31d425ee68524eceecef0364b0cbbdac2f not found: ID does not exist" Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.044389 4781 scope.go:117] "RemoveContainer" containerID="da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886" Dec 08 21:12:32 crc kubenswrapper[4781]: E1208 21:12:32.044649 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886\": container with ID starting with da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886 not found: ID does 
not exist" containerID="da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886" Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.044680 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886"} err="failed to get container status \"da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886\": rpc error: code = NotFound desc = could not find container \"da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886\": container with ID starting with da51ef0c19c379b150f5e90e74e991935b388f43fea8bde683d730fd8d591886 not found: ID does not exist" Dec 08 21:12:32 crc kubenswrapper[4781]: I1208 21:12:32.137450 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" path="/var/lib/kubelet/pods/2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9/volumes" Dec 08 21:12:36 crc kubenswrapper[4781]: I1208 21:12:36.128517 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:12:36 crc kubenswrapper[4781]: E1208 21:12:36.131304 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:12:39 crc kubenswrapper[4781]: I1208 21:12:39.267395 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-nfw2s_b83e9dad-39a3-4cce-aaaf-4f3ecc8a9e2d/cert-manager-controller/0.log" Dec 08 21:12:39 crc kubenswrapper[4781]: I1208 21:12:39.365907 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-t8kb7_579553a7-31d3-4b32-98fd-03e631e208d4/cert-manager-cainjector/0.log" Dec 08 21:12:39 crc kubenswrapper[4781]: I1208 21:12:39.432590 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-cpdhc_a8d7f2da-3db7-4daf-afd5-4d3984932e2d/cert-manager-webhook/0.log" Dec 08 21:12:48 crc kubenswrapper[4781]: I1208 21:12:48.125282 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:12:48 crc kubenswrapper[4781]: E1208 21:12:48.125949 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:12:52 crc kubenswrapper[4781]: I1208 21:12:52.622525 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-bsh4c_26857532-cde7-49e1-924f-eda2b362b6b7/nmstate-console-plugin/0.log" Dec 08 21:12:52 crc kubenswrapper[4781]: I1208 21:12:52.803965 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-777jl_74d0c70c-fc95-4b78-855f-96eb01f08c07/nmstate-handler/0.log" Dec 08 21:12:52 crc kubenswrapper[4781]: I1208 21:12:52.806620 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qjn2k_ff52b8c4-e459-42ed-96a1-dead7cdcd8b9/kube-rbac-proxy/0.log" Dec 08 21:12:52 crc kubenswrapper[4781]: I1208 21:12:52.922015 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-qjn2k_ff52b8c4-e459-42ed-96a1-dead7cdcd8b9/nmstate-metrics/0.log" Dec 08 21:12:53 crc kubenswrapper[4781]: I1208 21:12:53.023983 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-66p75_0e3edf9d-cddf-4b23-bda2-930fb5cbaf27/nmstate-operator/0.log" Dec 08 21:12:53 crc kubenswrapper[4781]: I1208 21:12:53.122602 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-2m6pl_f547b46e-1a15-4d7c-a3c6-0167927eb75c/nmstate-webhook/0.log" Dec 08 21:13:02 crc kubenswrapper[4781]: I1208 21:13:02.126635 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:13:02 crc kubenswrapper[4781]: E1208 21:13:02.127581 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:13:09 crc kubenswrapper[4781]: I1208 21:13:09.826400 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-k4scp_ee2c128a-e154-4685-957c-1cd19b86f113/kube-rbac-proxy/0.log" Dec 08 21:13:09 crc kubenswrapper[4781]: I1208 21:13:09.935070 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-k4scp_ee2c128a-e154-4685-957c-1cd19b86f113/controller/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.091488 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-frr-files/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 
21:13:10.258162 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-frr-files/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.300817 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-metrics/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.316439 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-reloader/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.316518 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-reloader/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.478319 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-metrics/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.488033 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-reloader/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.499094 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-metrics/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.504825 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-frr-files/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.653493 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-frr-files/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.679933 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-reloader/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.721020 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/controller/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.724303 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/cp-metrics/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.906431 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/frr-metrics/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.932498 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/kube-rbac-proxy/0.log" Dec 08 21:13:10 crc kubenswrapper[4781]: I1208 21:13:10.980439 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/kube-rbac-proxy-frr/0.log" Dec 08 21:13:11 crc kubenswrapper[4781]: I1208 21:13:11.160240 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/reloader/0.log" Dec 08 21:13:11 crc kubenswrapper[4781]: I1208 21:13:11.245650 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-lwsps_17dc730c-3861-434c-aeed-de6287a1c55b/frr-k8s-webhook-server/0.log" Dec 08 21:13:11 crc kubenswrapper[4781]: I1208 21:13:11.463680 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79f7dffd6f-8vls5_dd0edabf-1168-40aa-b197-c87637432272/manager/0.log" Dec 08 21:13:11 crc kubenswrapper[4781]: I1208 21:13:11.584163 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-dff497d76-fmghx_f195b37d-be68-434b-8295-23a8208108b8/webhook-server/0.log" Dec 08 21:13:11 crc kubenswrapper[4781]: I1208 21:13:11.796825 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b4s7x_95f89c56-40b6-4d4b-8060-154674b55a18/kube-rbac-proxy/0.log" Dec 08 21:13:12 crc kubenswrapper[4781]: I1208 21:13:12.143152 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-982hg_8bf0dbea-2341-414b-89fc-e36f1150ee8e/frr/0.log" Dec 08 21:13:12 crc kubenswrapper[4781]: I1208 21:13:12.163047 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b4s7x_95f89c56-40b6-4d4b-8060-154674b55a18/speaker/0.log" Dec 08 21:13:15 crc kubenswrapper[4781]: I1208 21:13:15.126489 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:13:15 crc kubenswrapper[4781]: E1208 21:13:15.127319 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:13:26 crc kubenswrapper[4781]: I1208 21:13:26.058105 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/util/0.log" Dec 08 21:13:26 crc kubenswrapper[4781]: I1208 21:13:26.126471 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:13:26 crc kubenswrapper[4781]: E1208 21:13:26.126704 4781 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:13:26 crc kubenswrapper[4781]: I1208 21:13:26.205790 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/util/0.log" Dec 08 21:13:26 crc kubenswrapper[4781]: I1208 21:13:26.254026 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/pull/0.log" Dec 08 21:13:26 crc kubenswrapper[4781]: I1208 21:13:26.273061 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/pull/0.log" Dec 08 21:13:26 crc kubenswrapper[4781]: I1208 21:13:26.449162 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/pull/0.log" Dec 08 21:13:26 crc kubenswrapper[4781]: I1208 21:13:26.464405 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/extract/0.log" Dec 08 21:13:26 crc kubenswrapper[4781]: I1208 21:13:26.467992 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f85jm9_7006c3a5-ac34-49f3-81f9-25a51e4a5e9f/util/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: 
I1208 21:13:27.137874 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/util/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.305433 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/util/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.338242 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/pull/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.354584 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/pull/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.526585 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/pull/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.526776 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/extract/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.549891 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83fg629_e4836df2-487a-4750-b2fc-28c2fd4394f5/util/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.692501 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-utilities/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.875933 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-content/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.888119 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-content/0.log" Dec 08 21:13:27 crc kubenswrapper[4781]: I1208 21:13:27.889144 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-utilities/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.028355 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-utilities/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.076079 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/extract-content/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.255126 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-utilities/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.542049 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-57nnf_70aa58ee-7ab9-4cd6-b0b2-6b0f6fac9b81/registry-server/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.549546 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-utilities/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.550133 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-content/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.576525 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-content/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.767084 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-content/0.log" Dec 08 21:13:28 crc kubenswrapper[4781]: I1208 21:13:28.772070 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/extract-utilities/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.030241 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-w8dwv_81d63467-e009-4c3d-8391-9f034f2da751/marketplace-operator/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.068812 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-utilities/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.315136 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-content/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.315178 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-content/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.339649 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-utilities/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.446031 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mh8gd_cdb95fae-ff11-445b-afc5-d1b040c9bff9/registry-server/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.530070 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-utilities/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.546464 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/extract-content/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.706035 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-lpj2j_9ac429e7-ae4c-4229-9c70-508f5ad917c8/registry-server/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.717048 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-utilities/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.895403 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-utilities/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.910551 4781 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-content/0.log" Dec 08 21:13:29 crc kubenswrapper[4781]: I1208 21:13:29.919190 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-content/0.log" Dec 08 21:13:30 crc kubenswrapper[4781]: I1208 21:13:30.073561 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-utilities/0.log" Dec 08 21:13:30 crc kubenswrapper[4781]: I1208 21:13:30.095349 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/extract-content/0.log" Dec 08 21:13:30 crc kubenswrapper[4781]: I1208 21:13:30.714259 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bqh9n_db9d24ac-95dc-4952-bef0-ad643de86795/registry-server/0.log" Dec 08 21:13:40 crc kubenswrapper[4781]: I1208 21:13:40.126506 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:13:40 crc kubenswrapper[4781]: E1208 21:13:40.127790 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:13:55 crc kubenswrapper[4781]: I1208 21:13:55.126012 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:13:55 crc kubenswrapper[4781]: E1208 21:13:55.126856 4781 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:14:06 crc kubenswrapper[4781]: I1208 21:14:06.126121 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:14:06 crc kubenswrapper[4781]: E1208 21:14:06.126743 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:14:20 crc kubenswrapper[4781]: I1208 21:14:20.125729 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:14:20 crc kubenswrapper[4781]: E1208 21:14:20.126571 4781 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kr4pr_openshift-machine-config-operator(7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8)\"" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" Dec 08 21:14:33 crc kubenswrapper[4781]: I1208 21:14:33.126522 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619" Dec 08 21:14:33 crc kubenswrapper[4781]: I1208 21:14:33.407631 4781 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"6225f60cf7d87f188af4fb0d8c492938aa7969632b8bd3413c32343313caa58a"} Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.206902 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4"] Dec 08 21:15:00 crc kubenswrapper[4781]: E1208 21:15:00.208058 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerName="extract-content" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.208076 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerName="extract-content" Dec 08 21:15:00 crc kubenswrapper[4781]: E1208 21:15:00.208094 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerName="registry-server" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.208103 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerName="registry-server" Dec 08 21:15:00 crc kubenswrapper[4781]: E1208 21:15:00.208124 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerName="extract-utilities" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.208134 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerName="extract-utilities" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.208377 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc8c0d1-a52f-4e13-b1b5-cc2ed38466f9" containerName="registry-server" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.209144 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.210982 4781 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.211258 4781 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.218993 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4"] Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.337289 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqvt\" (UniqueName: \"kubernetes.io/projected/73e29f65-74a5-4e2c-94bb-d7d763ee184c-kube-api-access-wrqvt\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.337382 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73e29f65-74a5-4e2c-94bb-d7d763ee184c-secret-volume\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.337467 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73e29f65-74a5-4e2c-94bb-d7d763ee184c-config-volume\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.438861 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73e29f65-74a5-4e2c-94bb-d7d763ee184c-config-volume\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.438966 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqvt\" (UniqueName: \"kubernetes.io/projected/73e29f65-74a5-4e2c-94bb-d7d763ee184c-kube-api-access-wrqvt\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.439013 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73e29f65-74a5-4e2c-94bb-d7d763ee184c-secret-volume\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.440704 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73e29f65-74a5-4e2c-94bb-d7d763ee184c-config-volume\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.559817 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/73e29f65-74a5-4e2c-94bb-d7d763ee184c-secret-volume\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.560612 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqvt\" (UniqueName: \"kubernetes.io/projected/73e29f65-74a5-4e2c-94bb-d7d763ee184c-kube-api-access-wrqvt\") pod \"collect-profiles-29420475-thtw4\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:00 crc kubenswrapper[4781]: I1208 21:15:00.835254 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:01 crc kubenswrapper[4781]: I1208 21:15:01.323721 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4"] Dec 08 21:15:01 crc kubenswrapper[4781]: I1208 21:15:01.716347 4781 generic.go:334] "Generic (PLEG): container finished" podID="73e29f65-74a5-4e2c-94bb-d7d763ee184c" containerID="404d15dbfb2af813fcaccd4c7cc334183822b86105f3c8ddba16d8807583e8d1" exitCode=0 Dec 08 21:15:01 crc kubenswrapper[4781]: I1208 21:15:01.716412 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" event={"ID":"73e29f65-74a5-4e2c-94bb-d7d763ee184c","Type":"ContainerDied","Data":"404d15dbfb2af813fcaccd4c7cc334183822b86105f3c8ddba16d8807583e8d1"} Dec 08 21:15:01 crc kubenswrapper[4781]: I1208 21:15:01.716724 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" 
event={"ID":"73e29f65-74a5-4e2c-94bb-d7d763ee184c","Type":"ContainerStarted","Data":"ca248d3b3a83c554b5734bb2b3634ada066cb3be63cdb5c9f5083d5437a7bf5c"} Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.106072 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.193422 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrqvt\" (UniqueName: \"kubernetes.io/projected/73e29f65-74a5-4e2c-94bb-d7d763ee184c-kube-api-access-wrqvt\") pod \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.193580 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73e29f65-74a5-4e2c-94bb-d7d763ee184c-config-volume\") pod \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.193748 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73e29f65-74a5-4e2c-94bb-d7d763ee184c-secret-volume\") pod \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\" (UID: \"73e29f65-74a5-4e2c-94bb-d7d763ee184c\") " Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.197804 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73e29f65-74a5-4e2c-94bb-d7d763ee184c-config-volume" (OuterVolumeSpecName: "config-volume") pod "73e29f65-74a5-4e2c-94bb-d7d763ee184c" (UID: "73e29f65-74a5-4e2c-94bb-d7d763ee184c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.202554 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e29f65-74a5-4e2c-94bb-d7d763ee184c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "73e29f65-74a5-4e2c-94bb-d7d763ee184c" (UID: "73e29f65-74a5-4e2c-94bb-d7d763ee184c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.207781 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e29f65-74a5-4e2c-94bb-d7d763ee184c-kube-api-access-wrqvt" (OuterVolumeSpecName: "kube-api-access-wrqvt") pod "73e29f65-74a5-4e2c-94bb-d7d763ee184c" (UID: "73e29f65-74a5-4e2c-94bb-d7d763ee184c"). InnerVolumeSpecName "kube-api-access-wrqvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.296611 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrqvt\" (UniqueName: \"kubernetes.io/projected/73e29f65-74a5-4e2c-94bb-d7d763ee184c-kube-api-access-wrqvt\") on node \"crc\" DevicePath \"\"" Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.296652 4781 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/73e29f65-74a5-4e2c-94bb-d7d763ee184c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.296663 4781 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/73e29f65-74a5-4e2c-94bb-d7d763ee184c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.743712 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" 
event={"ID":"73e29f65-74a5-4e2c-94bb-d7d763ee184c","Type":"ContainerDied","Data":"ca248d3b3a83c554b5734bb2b3634ada066cb3be63cdb5c9f5083d5437a7bf5c"} Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.744267 4781 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca248d3b3a83c554b5734bb2b3634ada066cb3be63cdb5c9f5083d5437a7bf5c" Dec 08 21:15:03 crc kubenswrapper[4781]: I1208 21:15:03.743779 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29420475-thtw4" Dec 08 21:15:04 crc kubenswrapper[4781]: I1208 21:15:04.188889 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb"] Dec 08 21:15:04 crc kubenswrapper[4781]: I1208 21:15:04.196666 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29420430-4s5rb"] Dec 08 21:15:06 crc kubenswrapper[4781]: I1208 21:15:06.158189 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1df355b-c158-4578-ae10-0690aa3cf69c" path="/var/lib/kubelet/pods/e1df355b-c158-4578-ae10-0690aa3cf69c/volumes" Dec 08 21:15:14 crc kubenswrapper[4781]: I1208 21:15:14.735655 4781 scope.go:117] "RemoveContainer" containerID="c00812a3f1376b90211151589226efc95081902ddf66f14532a003a949794817" Dec 08 21:15:14 crc kubenswrapper[4781]: I1208 21:15:14.860646 4781 generic.go:334] "Generic (PLEG): container finished" podID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerID="411ffc3a29c1712b50a9566cb0f658db2cf68758e48a85a0e29e8480d8fbb1f6" exitCode=0 Dec 08 21:15:14 crc kubenswrapper[4781]: I1208 21:15:14.860702 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-455h5/must-gather-zk4ld" event={"ID":"653dbb9a-6eca-4e85-8073-9cf00a7f1346","Type":"ContainerDied","Data":"411ffc3a29c1712b50a9566cb0f658db2cf68758e48a85a0e29e8480d8fbb1f6"} Dec 08 21:15:14 crc 
kubenswrapper[4781]: I1208 21:15:14.861406 4781 scope.go:117] "RemoveContainer" containerID="411ffc3a29c1712b50a9566cb0f658db2cf68758e48a85a0e29e8480d8fbb1f6" Dec 08 21:15:14 crc kubenswrapper[4781]: I1208 21:15:14.934832 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-455h5_must-gather-zk4ld_653dbb9a-6eca-4e85-8073-9cf00a7f1346/gather/0.log" Dec 08 21:15:24 crc kubenswrapper[4781]: I1208 21:15:24.787535 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-455h5/must-gather-zk4ld"] Dec 08 21:15:24 crc kubenswrapper[4781]: I1208 21:15:24.788548 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-455h5/must-gather-zk4ld" podUID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerName="copy" containerID="cri-o://41b2704a97035b881b64fa48a2d8d38152cd8893d9d99a9491416f9c51c8ffb6" gracePeriod=2 Dec 08 21:15:24 crc kubenswrapper[4781]: I1208 21:15:24.798659 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-455h5/must-gather-zk4ld"] Dec 08 21:15:24 crc kubenswrapper[4781]: I1208 21:15:24.969936 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-455h5_must-gather-zk4ld_653dbb9a-6eca-4e85-8073-9cf00a7f1346/copy/0.log" Dec 08 21:15:24 crc kubenswrapper[4781]: I1208 21:15:24.970862 4781 generic.go:334] "Generic (PLEG): container finished" podID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerID="41b2704a97035b881b64fa48a2d8d38152cd8893d9d99a9491416f9c51c8ffb6" exitCode=143 Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.231863 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-455h5_must-gather-zk4ld_653dbb9a-6eca-4e85-8073-9cf00a7f1346/copy/0.log" Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.233315 4781 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.366044 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6xxf\" (UniqueName: \"kubernetes.io/projected/653dbb9a-6eca-4e85-8073-9cf00a7f1346-kube-api-access-r6xxf\") pod \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\" (UID: \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\") " Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.366249 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/653dbb9a-6eca-4e85-8073-9cf00a7f1346-must-gather-output\") pod \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\" (UID: \"653dbb9a-6eca-4e85-8073-9cf00a7f1346\") " Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.531140 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/653dbb9a-6eca-4e85-8073-9cf00a7f1346-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "653dbb9a-6eca-4e85-8073-9cf00a7f1346" (UID: "653dbb9a-6eca-4e85-8073-9cf00a7f1346"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.570420 4781 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/653dbb9a-6eca-4e85-8073-9cf00a7f1346-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.859304 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653dbb9a-6eca-4e85-8073-9cf00a7f1346-kube-api-access-r6xxf" (OuterVolumeSpecName: "kube-api-access-r6xxf") pod "653dbb9a-6eca-4e85-8073-9cf00a7f1346" (UID: "653dbb9a-6eca-4e85-8073-9cf00a7f1346"). InnerVolumeSpecName "kube-api-access-r6xxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.879559 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6xxf\" (UniqueName: \"kubernetes.io/projected/653dbb9a-6eca-4e85-8073-9cf00a7f1346-kube-api-access-r6xxf\") on node \"crc\" DevicePath \"\"" Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.992469 4781 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-455h5_must-gather-zk4ld_653dbb9a-6eca-4e85-8073-9cf00a7f1346/copy/0.log" Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.993519 4781 scope.go:117] "RemoveContainer" containerID="41b2704a97035b881b64fa48a2d8d38152cd8893d9d99a9491416f9c51c8ffb6" Dec 08 21:15:25 crc kubenswrapper[4781]: I1208 21:15:25.993593 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-455h5/must-gather-zk4ld" Dec 08 21:15:26 crc kubenswrapper[4781]: I1208 21:15:26.015301 4781 scope.go:117] "RemoveContainer" containerID="411ffc3a29c1712b50a9566cb0f658db2cf68758e48a85a0e29e8480d8fbb1f6" Dec 08 21:15:26 crc kubenswrapper[4781]: I1208 21:15:26.139858 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" path="/var/lib/kubelet/pods/653dbb9a-6eca-4e85-8073-9cf00a7f1346/volumes" Dec 08 21:16:14 crc kubenswrapper[4781]: I1208 21:16:14.829071 4781 scope.go:117] "RemoveContainer" containerID="e5f3756745fb4b475b8612b1169575c0dc68f5651ff2e5840c24683e978d473b" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.701701 4781 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w5zf8"] Dec 08 21:16:15 crc kubenswrapper[4781]: E1208 21:16:15.703159 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerName="copy" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.703199 4781 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerName="copy" Dec 08 21:16:15 crc kubenswrapper[4781]: E1208 21:16:15.703238 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerName="gather" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.703252 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerName="gather" Dec 08 21:16:15 crc kubenswrapper[4781]: E1208 21:16:15.703275 4781 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e29f65-74a5-4e2c-94bb-d7d763ee184c" containerName="collect-profiles" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.703293 4781 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e29f65-74a5-4e2c-94bb-d7d763ee184c" containerName="collect-profiles" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.703648 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e29f65-74a5-4e2c-94bb-d7d763ee184c" containerName="collect-profiles" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.703754 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerName="copy" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.703791 4781 memory_manager.go:354] "RemoveStaleState removing state" podUID="653dbb9a-6eca-4e85-8073-9cf00a7f1346" containerName="gather" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.706705 4781 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w5zf8" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.711888 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5zf8"] Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.835815 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8ldm\" (UniqueName: \"kubernetes.io/projected/da52dfcb-90ef-4452-9570-452eebac5252-kube-api-access-n8ldm\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.835965 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-utilities\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.836029 4781 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-catalog-content\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.937465 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-catalog-content\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8" Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.937834 4781 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-n8ldm\" (UniqueName: \"kubernetes.io/projected/da52dfcb-90ef-4452-9570-452eebac5252-kube-api-access-n8ldm\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.937984 4781 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-utilities\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.938206 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-catalog-content\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.938510 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-utilities\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:15 crc kubenswrapper[4781]: I1208 21:16:15.958749 4781 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8ldm\" (UniqueName: \"kubernetes.io/projected/da52dfcb-90ef-4452-9570-452eebac5252-kube-api-access-n8ldm\") pod \"redhat-operators-w5zf8\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") " pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:16 crc kubenswrapper[4781]: I1208 21:16:16.043792 4781 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:16 crc kubenswrapper[4781]: I1208 21:16:16.549933 4781 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w5zf8"]
Dec 08 21:16:16 crc kubenswrapper[4781]: I1208 21:16:16.564611 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5zf8" event={"ID":"da52dfcb-90ef-4452-9570-452eebac5252","Type":"ContainerStarted","Data":"be4673206da02d697f14aca37f55a32d442f60e557d52a672c5f64cdc5dba730"}
Dec 08 21:16:17 crc kubenswrapper[4781]: I1208 21:16:17.578626 4781 generic.go:334] "Generic (PLEG): container finished" podID="da52dfcb-90ef-4452-9570-452eebac5252" containerID="ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55" exitCode=0
Dec 08 21:16:17 crc kubenswrapper[4781]: I1208 21:16:17.578731 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5zf8" event={"ID":"da52dfcb-90ef-4452-9570-452eebac5252","Type":"ContainerDied","Data":"ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55"}
Dec 08 21:16:17 crc kubenswrapper[4781]: I1208 21:16:17.582440 4781 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 08 21:16:19 crc kubenswrapper[4781]: E1208 21:16:19.489015 4781 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda52dfcb_90ef_4452_9570_452eebac5252.slice/crio-conmon-6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda52dfcb_90ef_4452_9570_452eebac5252.slice/crio-6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5.scope\": RecentStats: unable to find data in memory cache]"
Dec 08 21:16:19 crc kubenswrapper[4781]: I1208 21:16:19.618895 4781 generic.go:334] "Generic (PLEG): container finished" podID="da52dfcb-90ef-4452-9570-452eebac5252" containerID="6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5" exitCode=0
Dec 08 21:16:19 crc kubenswrapper[4781]: I1208 21:16:19.619063 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5zf8" event={"ID":"da52dfcb-90ef-4452-9570-452eebac5252","Type":"ContainerDied","Data":"6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5"}
Dec 08 21:16:20 crc kubenswrapper[4781]: I1208 21:16:20.631300 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5zf8" event={"ID":"da52dfcb-90ef-4452-9570-452eebac5252","Type":"ContainerStarted","Data":"d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6"}
Dec 08 21:16:20 crc kubenswrapper[4781]: I1208 21:16:20.656802 4781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w5zf8" podStartSLOduration=3.183434354 podStartE2EDuration="5.656774923s" podCreationTimestamp="2025-12-08 21:16:15 +0000 UTC" firstStartedPulling="2025-12-08 21:16:17.581845715 +0000 UTC m=+4293.733129162" lastFinishedPulling="2025-12-08 21:16:20.055186354 +0000 UTC m=+4296.206469731" observedRunningTime="2025-12-08 21:16:20.653826448 +0000 UTC m=+4296.805109825" watchObservedRunningTime="2025-12-08 21:16:20.656774923 +0000 UTC m=+4296.808058310"
Dec 08 21:16:26 crc kubenswrapper[4781]: I1208 21:16:26.044843 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:26 crc kubenswrapper[4781]: I1208 21:16:26.045366 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:26 crc kubenswrapper[4781]: I1208 21:16:26.095299 4781 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:26 crc kubenswrapper[4781]: I1208 21:16:26.732609 4781 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:26 crc kubenswrapper[4781]: I1208 21:16:26.783440 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5zf8"]
Dec 08 21:16:28 crc kubenswrapper[4781]: I1208 21:16:28.702461 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w5zf8" podUID="da52dfcb-90ef-4452-9570-452eebac5252" containerName="registry-server" containerID="cri-o://d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6" gracePeriod=2
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.277429 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.350157 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8ldm\" (UniqueName: \"kubernetes.io/projected/da52dfcb-90ef-4452-9570-452eebac5252-kube-api-access-n8ldm\") pod \"da52dfcb-90ef-4452-9570-452eebac5252\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") "
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.350261 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-utilities\") pod \"da52dfcb-90ef-4452-9570-452eebac5252\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") "
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.350311 4781 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-catalog-content\") pod \"da52dfcb-90ef-4452-9570-452eebac5252\" (UID: \"da52dfcb-90ef-4452-9570-452eebac5252\") "
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.351540 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-utilities" (OuterVolumeSpecName: "utilities") pod "da52dfcb-90ef-4452-9570-452eebac5252" (UID: "da52dfcb-90ef-4452-9570-452eebac5252"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.357042 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da52dfcb-90ef-4452-9570-452eebac5252-kube-api-access-n8ldm" (OuterVolumeSpecName: "kube-api-access-n8ldm") pod "da52dfcb-90ef-4452-9570-452eebac5252" (UID: "da52dfcb-90ef-4452-9570-452eebac5252"). InnerVolumeSpecName "kube-api-access-n8ldm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.452981 4781 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8ldm\" (UniqueName: \"kubernetes.io/projected/da52dfcb-90ef-4452-9570-452eebac5252-kube-api-access-n8ldm\") on node \"crc\" DevicePath \"\""
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.453024 4781 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-utilities\") on node \"crc\" DevicePath \"\""
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.724243 4781 generic.go:334] "Generic (PLEG): container finished" podID="da52dfcb-90ef-4452-9570-452eebac5252" containerID="d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6" exitCode=0
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.724375 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5zf8" event={"ID":"da52dfcb-90ef-4452-9570-452eebac5252","Type":"ContainerDied","Data":"d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6"}
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.725433 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w5zf8" event={"ID":"da52dfcb-90ef-4452-9570-452eebac5252","Type":"ContainerDied","Data":"be4673206da02d697f14aca37f55a32d442f60e557d52a672c5f64cdc5dba730"}
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.725499 4781 scope.go:117] "RemoveContainer" containerID="d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6"
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.724597 4781 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w5zf8"
Dec 08 21:16:29 crc kubenswrapper[4781]: I1208 21:16:29.756858 4781 scope.go:117] "RemoveContainer" containerID="6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5"
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.291676 4781 scope.go:117] "RemoveContainer" containerID="ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55"
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.352863 4781 scope.go:117] "RemoveContainer" containerID="d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6"
Dec 08 21:16:30 crc kubenswrapper[4781]: E1208 21:16:30.353446 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6\": container with ID starting with d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6 not found: ID does not exist" containerID="d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6"
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.353502 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6"} err="failed to get container status \"d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6\": rpc error: code = NotFound desc = could not find container \"d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6\": container with ID starting with d85a0f4c8b0aadc868d76e0ecd288d3ea475e32f815ee36068b2d71f07f956f6 not found: ID does not exist"
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.353528 4781 scope.go:117] "RemoveContainer" containerID="6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5"
Dec 08 21:16:30 crc kubenswrapper[4781]: E1208 21:16:30.353846 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5\": container with ID starting with 6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5 not found: ID does not exist" containerID="6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5"
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.353878 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5"} err="failed to get container status \"6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5\": rpc error: code = NotFound desc = could not find container \"6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5\": container with ID starting with 6b43696ad393b9b695d3b72e1eb2e00e287dc5cedd989f515d5a6e19f863fee5 not found: ID does not exist"
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.353892 4781 scope.go:117] "RemoveContainer" containerID="ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55"
Dec 08 21:16:30 crc kubenswrapper[4781]: E1208 21:16:30.354236 4781 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55\": container with ID starting with ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55 not found: ID does not exist" containerID="ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55"
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.354303 4781 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55"} err="failed to get container status \"ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55\": rpc error: code = NotFound desc = could not find container \"ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55\": container with ID starting with ea435aaa8805aa3e200ed86f587a5c35da2b04d172ca1be55bbd32b1d49fbc55 not found: ID does not exist"
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.671600 4781 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da52dfcb-90ef-4452-9570-452eebac5252" (UID: "da52dfcb-90ef-4452-9570-452eebac5252"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.680911 4781 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da52dfcb-90ef-4452-9570-452eebac5252-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.966438 4781 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w5zf8"]
Dec 08 21:16:30 crc kubenswrapper[4781]: I1208 21:16:30.976805 4781 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w5zf8"]
Dec 08 21:16:32 crc kubenswrapper[4781]: I1208 21:16:32.140985 4781 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da52dfcb-90ef-4452-9570-452eebac5252" path="/var/lib/kubelet/pods/da52dfcb-90ef-4452-9570-452eebac5252/volumes"
Dec 08 21:16:59 crc kubenswrapper[4781]: I1208 21:16:59.948022 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 21:16:59 crc kubenswrapper[4781]: I1208 21:16:59.948867 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 21:17:14 crc kubenswrapper[4781]: I1208 21:17:14.926046 4781 scope.go:117] "RemoveContainer" containerID="355f85da88b0a9d1f3e9ea064045626ea25c0742115190839bdfab8965e3facb"
Dec 08 21:17:14 crc kubenswrapper[4781]: I1208 21:17:14.962281 4781 scope.go:117] "RemoveContainer" containerID="c2f0e83e2db9c80d414426b38a05c999e7d1cc109275b1216b836e8b6d15efe8"
Dec 08 21:17:15 crc kubenswrapper[4781]: I1208 21:17:15.035886 4781 scope.go:117] "RemoveContainer" containerID="ba1830a5065da7fd6fc35bdf8e50bda7a058760e1623aab89eca71f95faa39da"
Dec 08 21:17:15 crc kubenswrapper[4781]: I1208 21:17:15.108329 4781 scope.go:117] "RemoveContainer" containerID="7d05f698ced240742e084fe4192aa9d7fd2c165dd3e00d30c36125beb4a86398"
Dec 08 21:17:29 crc kubenswrapper[4781]: I1208 21:17:29.948155 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 21:17:29 crc kubenswrapper[4781]: I1208 21:17:29.948812 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 21:17:59 crc kubenswrapper[4781]: I1208 21:17:59.947957 4781 patch_prober.go:28] interesting pod/machine-config-daemon-kr4pr container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 08 21:17:59 crc kubenswrapper[4781]: I1208 21:17:59.948746 4781 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 08 21:17:59 crc kubenswrapper[4781]: I1208 21:17:59.948816 4781 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr"
Dec 08 21:17:59 crc kubenswrapper[4781]: I1208 21:17:59.949885 4781 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6225f60cf7d87f188af4fb0d8c492938aa7969632b8bd3413c32343313caa58a"} pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 08 21:17:59 crc kubenswrapper[4781]: I1208 21:17:59.950053 4781 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" podUID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerName="machine-config-daemon" containerID="cri-o://6225f60cf7d87f188af4fb0d8c492938aa7969632b8bd3413c32343313caa58a" gracePeriod=600
Dec 08 21:18:00 crc kubenswrapper[4781]: I1208 21:18:00.782997 4781 generic.go:334] "Generic (PLEG): container finished" podID="7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8" containerID="6225f60cf7d87f188af4fb0d8c492938aa7969632b8bd3413c32343313caa58a" exitCode=0
Dec 08 21:18:00 crc kubenswrapper[4781]: I1208 21:18:00.783044 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerDied","Data":"6225f60cf7d87f188af4fb0d8c492938aa7969632b8bd3413c32343313caa58a"}
Dec 08 21:18:00 crc kubenswrapper[4781]: I1208 21:18:00.783311 4781 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kr4pr" event={"ID":"7f7ef82f-3bfa-4ebf-a7f0-b2d00fbcc6a8","Type":"ContainerStarted","Data":"93c4029fa363ae0452d8f3ce2ba231006c3e287702694fee7375aeab7323fc13"}
Dec 08 21:18:00 crc kubenswrapper[4781]: I1208 21:18:00.783339 4781 scope.go:117] "RemoveContainer" containerID="9cf3f79c4fbb1a9a4c3fb459019f6d37fbc7c4a2b53822adc8f484368ed0d619"